Diagnosed failure

RollingRestartArgs/RollingRestartITest.TestWorkloads/4: /home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/integration-tests/maintenance_mode-itest.cc:751: Failure
Value of: s.ok()
  Actual: true
Expected: false
/home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/util/test_util.cc:403: Failure
Failed
Timed out waiting for assertion to pass.
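
The two failures above are Kudu's retried-assertion pattern: the check at maintenance_mode-itest.cc:751 asserts that a Status is not OK, and AssertEventually() in test_util.cc keeps re-running that check until it passes or a deadline expires; here it never passed, so the "Timed out waiting for assertion to pass" failure fired. Below is a minimal sketch of that pattern, with a hypothetical helper standing in for the real check at line 751 (this is not the actual test body):

    // Sketch only -- AttemptOperationExpectedToBeRejected() is a hypothetical
    // stand-in for whatever the test actually checks at maintenance_mode-itest.cc:751.
    #include <gtest/gtest.h>
    #include "kudu/util/status.h"
    #include "kudu/util/test_util.h"  // ASSERT_EVENTUALLY / AssertEventually()

    namespace kudu {

    // While this keeps returning OK, the assertion below keeps failing,
    // which is what "Actual: true / Expected: false" in the log means.
    Status AttemptOperationExpectedToBeRejected() {
      return Status::OK();
    }

    TEST(MaintenanceModeSketch, RetriedAssertion) {
      // ASSERT_EVENTUALLY re-runs the lambda until it passes or the deadline expires.
      ASSERT_EVENTUALLY([&] {
        Status s = AttemptOperationExpectedToBeRejected();
        ASSERT_FALSE(s.ok());
      });
      // On expiry, AssertEventually() reports
      // "Timed out waiting for assertion to pass." (test_util.cc:403).
    }

    }  // namespace kudu

The rest of this section is the log captured after the failure: heartbeat traffic at the moment of the timeout, then stack dumps of each tablet server. Note that the dumps are unsymbolized (every frame shows "?? ()"), so they identify thread names but not code paths.
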
I20251024 08:17:02.885730 24176 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:17:02.887259 24441 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:17:02.892647 24040 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:17:03.281037 24308 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:17:04.111250 18753 external_mini_cluster-itest-base.cc:80] Found fatal failure
I20251024 08:17:04.111372 18753 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 0 with UUID 97d4708eb2b64571b34044be6da3d298 and pid 24045
************************ BEGIN STACKS **************************
[New LWP 24048]
[New LWP 24049]
[New LWP 24050]
[New LWP 24051]
[New LWP 24057]
[New LWP 24058]
[New LWP 24059]
[New LWP 24062]
[New LWP 24063]
[New LWP 24064]
[New LWP 24065]
[New LWP 24066]
[New LWP 24067]
[New LWP 24069]
[New LWP 24070]
[New LWP 24071]
[New LWP 24072]
[New LWP 24073]
[New LWP 24074]
[New LWP 24075]
[New LWP 24076]
[New LWP 24077]
[New LWP 24078]
[New LWP 24079]
[New LWP 24080]
[New LWP 24081]
[New LWP 24082]
[New LWP 24083]
[New LWP 24084]
[New LWP 24085]
[New LWP 24086]
[New LWP 24087]
[New LWP 24088]
[New LWP 24089]
[New LWP 24090]
[New LWP 24091]
[New LWP 24092]
[New LWP 24093]
[New LWP 24094]
[New LWP 24095]
[New LWP 24096]
[New LWP 24097]
[New LWP 24098]
[New LWP 24099]
[New LWP 24100]
[New LWP 24101]
[New LWP 24102]
[New LWP 24103]
[New LWP 24104]
[New LWP 24105]
[New LWP 24106]
[New LWP 24107]
[New LWP 24108]
[New LWP 24109]
[New LWP 24110]
[New LWP 24111]
[New LWP 24112]
[New LWP 24113]
[New LWP 24114]
[New LWP 24115]
[New LWP 24116]
[New LWP 24117]
[New LWP 24118]
[New LWP 24119]
[New LWP 24120]
[New LWP 24121]
[New LWP 24122]
[New LWP 24123]
[New LWP 24124]
[New LWP 24125]
[New LWP 24126]
[New LWP 24127]
[New LWP 24128]
[New LWP 24129]
[New LWP 24130]
[New LWP 24131]
[New LWP 24132]
[New LWP 24133]
[New LWP 24134]
[New LWP 24135]
[New LWP 24136]
[New LWP 24137]
[New LWP 24138]
[New LWP 24139]
[New LWP 24140]
[New LWP 24141]
[New LWP 24142]
[New LWP 24143]
[New LWP 24144]
[New LWP 24145]
[New LWP 24146]
[New LWP 24147]
[New LWP 24148]
[New LWP 24149]
[New LWP 24150]
[New LWP 24151]
[New LWP 24152]
[New LWP 24153]
[New LWP 24154]
[New LWP 24155]
[New LWP 24156]
[New LWP 24157]
[New LWP 24158]
[New LWP 24159]
[New LWP 24160]
[New LWP 24161]
[New LWP 24162]
[New LWP 24163]
[New LWP 24164]
[New LWP 24165]
[New LWP 24166]
[New LWP 24167]
[New LWP 24168]
[New LWP 24169]
[New LWP 24170]
[New LWP 24171]
[New LWP 24172]
[New LWP 24173]
[New LWP 24174]
[New LWP 24175]
[New LWP 24176]
[New LWP 24177]
0x00007fce8e741d50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 24045 "kudu"  0x00007fce8e741d50 in ?? ()
  2    LWP 24048 "kudu"  0x00007fce8e73dfb9 in ?? ()
  3    LWP 24049 "kudu"  0x00007fce8e73dfb9 in ?? ()
  4    LWP 24050 "kudu"  0x00007fce8e73dfb9 in ?? ()
  5    LWP 24051 "kernel-watcher-" 0x00007fce8e73dfb9 in ?? ()
  6    LWP 24057 "ntp client-2405" 0x00007fce8e7419e2 in ?? ()
  7    LWP 24058 "file cache-evic" 0x00007fce8e73dfb9 in ?? ()
  8    LWP 24059 "sq_acceptor" 0x00007fce8c852cb9 in ?? ()
  9    LWP 24062 "rpc reactor-240" 0x00007fce8c85fa47 in ?? ()
  10   LWP 24063 "rpc reactor-240" 0x00007fce8c85fa47 in ?? ()
  11   LWP 24064 "rpc reactor-240" 0x00007fce8c85fa47 in ?? ()
  12   LWP 24065 "rpc reactor-240" 0x00007fce8c85fa47 in ?? ()
  13   LWP 24066 "MaintenanceMgr " 0x00007fce8e73dad3 in ?? ()
  14   LWP 24067 "txn-status-mana" 0x00007fce8e73dfb9 in ?? ()
  15   LWP 24069 "collect_and_rem" 0x00007fce8e73dfb9 in ?? ()
  16   LWP 24070 "tc-session-exp-" 0x00007fce8e73dfb9 in ?? ()
  17   LWP 24071 "rpc worker-2407" 0x00007fce8e73dad3 in ?? ()
  18   LWP 24072 "rpc worker-2407" 0x00007fce8e73dad3 in ?? ()
  19   LWP 24073 "rpc worker-2407" 0x00007fce8e73dad3 in ?? ()
  20   LWP 24074 "rpc worker-2407" 0x00007fce8e73dad3 in ?? ()
  21   LWP 24075 "rpc worker-2407" 0x00007fce8e73dad3 in ?? ()
  22   LWP 24076 "rpc worker-2407" 0x00007fce8e73dad3 in ?? ()
  23   LWP 24077 "rpc worker-2407" 0x00007fce8e73dad3 in ?? ()
  24   LWP 24078 "rpc worker-2407" 0x00007fce8e73dad3 in ?? ()
  25   LWP 24079 "rpc worker-2407" 0x00007fce8e73dad3 in ?? ()
  26   LWP 24080 "rpc worker-2408" 0x00007fce8e73dad3 in ?? ()
  27   LWP 24081 "rpc worker-2408" 0x00007fce8e73dad3 in ?? ()
  28   LWP 24082 "rpc worker-2408" 0x00007fce8e73dad3 in ?? ()
  29   LWP 24083 "rpc worker-2408" 0x00007fce8e73dad3 in ?? ()
  30   LWP 24084 "rpc worker-2408" 0x00007fce8e73dad3 in ?? ()
  31   LWP 24085 "rpc worker-2408" 0x00007fce8e73dad3 in ?? ()
  32   LWP 24086 "rpc worker-2408" 0x00007fce8e73dad3 in ?? ()
  33   LWP 24087 "rpc worker-2408" 0x00007fce8e73dad3 in ?? ()
  34   LWP 24088 "rpc worker-2408" 0x00007fce8e73dad3 in ?? ()
  35   LWP 24089 "rpc worker-2408" 0x00007fce8e73dad3 in ?? ()
  36   LWP 24090 "rpc worker-2409" 0x00007fce8e73dad3 in ?? ()
  37   LWP 24091 "rpc worker-2409" 0x00007fce8e73dad3 in ?? ()
  38   LWP 24092 "rpc worker-2409" 0x00007fce8e73dad3 in ?? ()
  39   LWP 24093 "rpc worker-2409" 0x00007fce8e73dad3 in ?? ()
  40   LWP 24094 "rpc worker-2409" 0x00007fce8e73dad3 in ?? ()
  41   LWP 24095 "rpc worker-2409" 0x00007fce8e73dad3 in ?? ()
  42   LWP 24096 "rpc worker-2409" 0x00007fce8e73dad3 in ?? ()
  43   LWP 24097 "rpc worker-2409" 0x00007fce8e73dad3 in ?? ()
  44   LWP 24098 "rpc worker-2409" 0x00007fce8e73dad3 in ?? ()
  45   LWP 24099 "rpc worker-2409" 0x00007fce8e73dad3 in ?? ()
  46   LWP 24100 "rpc worker-2410" 0x00007fce8e73dad3 in ?? ()
  47   LWP 24101 "rpc worker-2410" 0x00007fce8e73dad3 in ?? ()
  48   LWP 24102 "rpc worker-2410" 0x00007fce8e73dad3 in ?? ()
  49   LWP 24103 "rpc worker-2410" 0x00007fce8e73dad3 in ?? ()
  50   LWP 24104 "rpc worker-2410" 0x00007fce8e73dad3 in ?? ()
  51   LWP 24105 "rpc worker-2410" 0x00007fce8e73dad3 in ?? ()
  52   LWP 24106 "rpc worker-2410" 0x00007fce8e73dad3 in ?? ()
  53   LWP 24107 "rpc worker-2410" 0x00007fce8e73dad3 in ?? ()
  54   LWP 24108 "rpc worker-2410" 0x00007fce8e73dad3 in ?? ()
  55   LWP 24109 "rpc worker-2410" 0x00007fce8e73dad3 in ?? ()
  56   LWP 24110 "rpc worker-2411" 0x00007fce8e73dad3 in ?? ()
  57   LWP 24111 "rpc worker-2411" 0x00007fce8e73dad3 in ?? ()
  58   LWP 24112 "rpc worker-2411" 0x00007fce8e73dad3 in ?? ()
  59   LWP 24113 "rpc worker-2411" 0x00007fce8e73dad3 in ?? ()
  60   LWP 24114 "rpc worker-2411" 0x00007fce8e73dad3 in ?? ()
  61   LWP 24115 "rpc worker-2411" 0x00007fce8e73dad3 in ?? ()
  62   LWP 24116 "rpc worker-2411" 0x00007fce8e73dad3 in ?? ()
  63   LWP 24117 "rpc worker-2411" 0x00007fce8e73dad3 in ?? ()
  64   LWP 24118 "rpc worker-2411" 0x00007fce8e73dad3 in ?? ()
  65   LWP 24119 "rpc worker-2411" 0x00007fce8e73dad3 in ?? ()
  66   LWP 24120 "rpc worker-2412" 0x00007fce8e73dad3 in ?? ()
  67   LWP 24121 "rpc worker-2412" 0x00007fce8e73dad3 in ?? ()
  68   LWP 24122 "rpc worker-2412" 0x00007fce8e73dad3 in ?? ()
  69   LWP 24123 "rpc worker-2412" 0x00007fce8e73dad3 in ?? ()
  70   LWP 24124 "rpc worker-2412" 0x00007fce8e73dad3 in ?? ()
  71   LWP 24125 "rpc worker-2412" 0x00007fce8e73dad3 in ?? ()
  72   LWP 24126 "rpc worker-2412" 0x00007fce8e73dad3 in ?? ()
  73   LWP 24127 "rpc worker-2412" 0x00007fce8e73dad3 in ?? ()
  74   LWP 24128 "rpc worker-2412" 0x00007fce8e73dad3 in ?? ()
  75   LWP 24129 "rpc worker-2412" 0x00007fce8e73dad3 in ?? ()
  76   LWP 24130 "rpc worker-2413" 0x00007fce8e73dad3 in ?? ()
  77   LWP 24131 "rpc worker-2413" 0x00007fce8e73dad3 in ?? ()
  78   LWP 24132 "rpc worker-2413" 0x00007fce8e73dad3 in ?? ()
  79   LWP 24133 "rpc worker-2413" 0x00007fce8e73dad3 in ?? ()
  80   LWP 24134 "rpc worker-2413" 0x00007fce8e73dad3 in ?? ()
  81   LWP 24135 "rpc worker-2413" 0x00007fce8e73dad3 in ?? ()
  82   LWP 24136 "rpc worker-2413" 0x00007fce8e73dad3 in ?? ()
  83   LWP 24137 "rpc worker-2413" 0x00007fce8e73dad3 in ?? ()
  84   LWP 24138 "rpc worker-2413" 0x00007fce8e73dad3 in ?? ()
  85   LWP 24139 "rpc worker-2413" 0x00007fce8e73dad3 in ?? ()
  86   LWP 24140 "rpc worker-2414" 0x00007fce8e73dad3 in ?? ()
  87   LWP 24141 "rpc worker-2414" 0x00007fce8e73dad3 in ?? ()
  88   LWP 24142 "rpc worker-2414" 0x00007fce8e73dad3 in ?? ()
  89   LWP 24143 "rpc worker-2414" 0x00007fce8e73dad3 in ?? ()
  90   LWP 24144 "rpc worker-2414" 0x00007fce8e73dad3 in ?? ()
  91   LWP 24145 "rpc worker-2414" 0x00007fce8e73dad3 in ?? ()
  92   LWP 24146 "rpc worker-2414" 0x00007fce8e73dad3 in ?? ()
  93   LWP 24147 "rpc worker-2414" 0x00007fce8e73dad3 in ?? ()
  94   LWP 24148 "rpc worker-2414" 0x00007fce8e73dad3 in ?? ()
  95   LWP 24149 "rpc worker-2414" 0x00007fce8e73dad3 in ?? ()
  96   LWP 24150 "rpc worker-2415" 0x00007fce8e73dad3 in ?? ()
  97   LWP 24151 "rpc worker-2415" 0x00007fce8e73dad3 in ?? ()
  98   LWP 24152 "rpc worker-2415" 0x00007fce8e73dad3 in ?? ()
  99   LWP 24153 "rpc worker-2415" 0x00007fce8e73dad3 in ?? ()
  100  LWP 24154 "rpc worker-2415" 0x00007fce8e73dad3 in ?? ()
  101  LWP 24155 "rpc worker-2415" 0x00007fce8e73dad3 in ?? ()
  102  LWP 24156 "rpc worker-2415" 0x00007fce8e73dad3 in ?? ()
  103  LWP 24157 "rpc worker-2415" 0x00007fce8e73dad3 in ?? ()
  104  LWP 24158 "rpc worker-2415" 0x00007fce8e73dad3 in ?? ()
  105  LWP 24159 "rpc worker-2415" 0x00007fce8e73dad3 in ?? ()
  106  LWP 24160 "rpc worker-2416" 0x00007fce8e73dad3 in ?? ()
  107  LWP 24161 "rpc worker-2416" 0x00007fce8e73dad3 in ?? ()
  108  LWP 24162 "rpc worker-2416" 0x00007fce8e73dad3 in ?? ()
  109  LWP 24163 "rpc worker-2416" 0x00007fce8e73dad3 in ?? ()
  110  LWP 24164 "rpc worker-2416" 0x00007fce8e73dad3 in ?? ()
  111  LWP 24165 "rpc worker-2416" 0x00007fce8e73dad3 in ?? ()
  112  LWP 24166 "rpc worker-2416" 0x00007fce8e73dad3 in ?? ()
  113  LWP 24167 "rpc worker-2416" 0x00007fce8e73dad3 in ?? ()
  114  LWP 24168 "rpc worker-2416" 0x00007fce8e73dad3 in ?? ()
  115  LWP 24169 "rpc worker-2416" 0x00007fce8e73dad3 in ?? ()
  116  LWP 24170 "rpc worker-2417" 0x00007fce8e73dad3 in ?? ()
  117  LWP 24171 "diag-logger-241" 0x00007fce8e73dfb9 in ?? ()
  118  LWP 24172 "result-tracker-" 0x00007fce8e73dfb9 in ?? ()
  119  LWP 24173 "excess-log-dele" 0x00007fce8e73dfb9 in ?? ()
  120  LWP 24174 "tcmalloc-memory" 0x00007fce8e73dfb9 in ?? ()
  121  LWP 24175 "acceptor-24175" 0x00007fce8c8610c7 in ?? ()
  122  LWP 24176 "heartbeat-24176" 0x00007fce8e73dfb9 in ?? ()
  123  LWP 24177 "maintenance_sch" 0x00007fce8e73dfb9 in ?? ()

Thread 123 (LWP 24177):
#0  0x00007fce8e73dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000021 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055af22639e50 in ?? ()
#5  0x00007fce44fdf470 in ?? ()
#6  0x0000000000000042 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 24176):
#0  0x00007fce8e73dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055af225a3930 in ?? ()
#5  0x00007fce457e03f0 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 121 (LWP 24175):
#0  0x00007fce8c8610c7 in ?? ()
#1  0x00007fce45fe1020 in ?? ()
#2  0x00007fce8e3c1c02 in ?? ()
#3  0x00007fce45fe1020 in ?? ()
#4  0x0000000000080800 in ?? ()
#5  0x00007fce45fe13e0 in ?? ()
#6  0x00007fce45fe1090 in ?? ()
#7  0x000055af2255f0f8 in ?? ()
#8  0x00007fce8e3c7699 in ?? ()
#9  0x00007fce45fe1510 in ?? ()
#10 0x00007fce45fe1700 in ?? ()
#11 0x0000008000000000 in ?? ()
#12 0x00007fce8e7413a7 in ?? ()
#13 0x00007fce45fe2520 in ?? ()
#14 0x00007fce45fe1260 in ?? ()
#15 0x000055af225ff0c0 in ?? ()
#16 0x0000000000000000 in ?? ()

Thread 120 (LWP 24174):
#0  0x00007fce8e73dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffd18965c00 in ?? ()
#5  0x00007fce467e2670 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 24173):
#0  0x00007fce8e73dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 118 (LWP 24172):
#0  0x00007fce8e73dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000008 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055af224d7b70 in ?? ()
#5  0x00007fce477e4680 in ?? ()
#6  0x0000000000000010 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 24171):
#0  0x00007fce8e73dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000008 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055af22853390 in ?? ()
#5  0x00007fce47fe5550 in ?? ()
#6  0x0000000000000010 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 116 (LWP 24170):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 115 (LWP 24169):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 114 (LWP 24168):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 24167):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 24166):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 24165):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 24164):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 24163):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 24162):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 24161):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 24160):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 24159):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 24158):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 24157):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 24156):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 24155):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 24154):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 24153):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 24152):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000008 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055af22855738 in ?? ()
#4  0x00007fce517f85d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fce517f85f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 97 (LWP 24151):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 24150):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 24149):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 24148):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 24147):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 24146):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 24145):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 24144):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 24143):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 24142):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 24141):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 24140):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 24139):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 24138):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 24137):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 24136):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 24135):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 24134):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 24133):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 24132):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 24131):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 76 (LWP 24130):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000324 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055af2281f138 in ?? ()
#4  0x00007fce5c80e5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fce5c80e5f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 75 (LWP 24129):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x000000000000023b in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055af2281f0bc in ?? ()
#4  0x00007fce5d00f5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fce5d00f5f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055af2281f0a8 in ?? ()
#9  0x00007fce8e73d770 in ?? ()
#10 0x00007fce5d00f5f0 in ?? ()
#11 0x00007fce5d00f650 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 74 (LWP 24128):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 24127):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 24126):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 24125):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 24124):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 24123):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 24122):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 24121):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 24120):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 24119):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 24118):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 24117):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 24116):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 24115):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 24114):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 24113):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 24112):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 24111):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 56 (LWP 24110):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055af2281e638 in ?? ()
#4  0x00007fce668225d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fce668225f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 55 (LWP 24109):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 24108):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 24107):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 24106):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 24105):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 24104):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 24103):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 24102):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 24101):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 24100):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 24099):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 24098):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 24097):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 24096):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 24095):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 24094):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 24093):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 24092):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 24091):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 36 (LWP 24090):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055af22741b3c in ?? ()
#4  0x00007fce708365d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fce708365f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055af22741b28 in ?? ()
#9  0x00007fce8e73d770 in ?? ()
#10 0x00007fce708365f0 in ?? ()
#11 0x00007fce70836650 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 35 (LWP 24089):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 34 (LWP 24088):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 33 (LWP 24087):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 32 (LWP 24086):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000034 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055af22741a38 in ?? ()
#4  0x00007fce7283a5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fce7283a5f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 31 (LWP 24085):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000025 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055af22741abc in ?? ()
#4  0x00007fce7303b5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fce7303b5f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055af22741aa8 in ?? ()
#9  0x00007fce8e73d770 in ?? ()
#10 0x00007fce7303b5f0 in ?? ()
#11 0x00007fce7303b650 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 30 (LWP 24084):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 24083):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 24082):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 24081):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 24080):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 24079):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 24078):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 24077):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 24076):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 24075):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 24074):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 19 (LWP 24073):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 18 (LWP 24072):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 17 (LWP 24071):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 16 (LWP 24070):
#0  0x00007fce8e73dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 15 (LWP 24069):
#0  0x00007fce8e73dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055af224bd6c8 in ?? ()
#5  0x00007fce7b04b6a0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 24067):
#0  0x00007fce8e73dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 13 (LWP 24066):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 24065):
#0  0x00007fce8c85fa47 in ?? ()
#1  0x00007fce7d04f680 in ?? ()
#2  0x00007fce87b63571 in ?? ()
#3  0x00000000000002b7 in ?? ()
#4  0x000055af225b4e58 in ?? ()
#5  0x00007fce7d04f6c0 in ?? ()
#6  0x00007fce7d04f840 in ?? ()
#7  0x000055af22656c10 in ?? ()
#8  0x00007fce87b6525d in ?? ()
#9  0x3fb958e7fce54000 in ?? ()
#10 0x000055af225a6c00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055af225a6c00 in ?? ()
#13 0x00000000225b4e58 in ?? ()
#14 0x000055af00000000 in ?? ()
#15 0x41da3ecc3e24688d in ?? ()
#16 0x000055af22656c10 in ?? ()
#17 0x00007fce7d04f720 in ?? ()
#18 0x00007fce87b69ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb958e7fce54000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 11 (LWP 24064):
#0  0x00007fce8c85fa47 in ?? ()
#1  0x00007fce7d850680 in ?? ()
#2  0x00007fce87b63571 in ?? ()
#3  0x00000000000002b7 in ?? ()
#4  0x000055af225b5a98 in ?? ()
#5  0x00007fce7d8506c0 in ?? ()
#6  0x00007fce7d850840 in ?? ()
#7  0x000055af22656c10 in ?? ()
#8  0x00007fce87b6525d in ?? ()
#9  0x3fa8dc2cbc0f8000 in ?? ()
#10 0x000055af225a6100 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055af225a6100 in ?? ()
#13 0x00000000225b5a98 in ?? ()
#14 0x000055af00000000 in ?? ()
#15 0x41da3ecc3e24688c in ?? ()
#16 0x000055af22656c10 in ?? ()
#17 0x00007fce7d850720 in ?? ()
#18 0x00007fce87b69ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fa8dc2cbc0f8000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 10 (LWP 24063):
#0  0x00007fce8c85fa47 in ?? ()
#1  0x00007fce7e051680 in ?? ()
#2  0x00007fce87b63571 in ?? ()
#3  0x00000000000002b7 in ?? ()
#4  0x000055af225b5c58 in ?? ()
#5  0x00007fce7e0516c0 in ?? ()
#6  0x00007fce7e051840 in ?? ()
#7  0x000055af22656c10 in ?? ()
#8  0x00007fce87b6525d in ?? ()
#9  0x3fb97fc24b13c000 in ?? ()
#10 0x000055af225a5600 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055af225a5600 in ?? ()
#13 0x00000000225b5c58 in ?? ()
#14 0x000055af00000000 in ?? ()
#15 0x41da3ecc3e24688b in ?? ()
#16 0x000055af22656c10 in ?? ()
#17 0x00007fce7e051720 in ?? ()
#18 0x00007fce87b69ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb97fc24b13c000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 9 (LWP 24062):
#0  0x00007fce8c85fa47 in ?? ()
#1  0x00007fce7fe41680 in ?? ()
#2  0x00007fce87b63571 in ?? ()
#3  0x00000000000002b7 in ?? ()
#4  0x000055af225b5e18 in ?? ()
#5  0x00007fce7fe416c0 in ?? ()
#6  0x00007fce7fe41840 in ?? ()
#7  0x000055af22656c10 in ?? ()
#8  0x00007fce87b6525d in ?? ()
#9  0x3fb9777166164000 in ?? ()
#10 0x000055af225a5b80 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055af225a5b80 in ?? ()
#13 0x00000000225b5e18 in ?? ()
#14 0x000055af00000000 in ?? ()
#15 0x41da3ecc3e24688b in ?? ()
#16 0x000055af22656c10 in ?? ()
#17 0x00007fce7fe41720 in ?? ()
#18 0x00007fce87b69ba3 in ?? ()
#19 0x0000000000000000 in ?? ()

Thread 8 (LWP 24059):
#0  0x00007fce8c852cb9 in ?? ()
#1  0x00007fce81644840 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 7 (LWP 24058):
#0  0x00007fce8e73dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 6 (LWP 24057):
#0  0x00007fce8e7419e2 in ?? ()
#1  0x000055af224d7ee0 in ?? ()
#2  0x00007fce806424d0 in ?? ()
#3  0x00007fce80642450 in ?? ()
#4  0x00007fce80642570 in ?? ()
#5  0x00007fce80642790 in ?? ()
#6  0x00007fce806427a0 in ?? ()
#7  0x00007fce806424e0 in ?? ()
#8  0x00007fce806424d0 in ?? ()
#9  0x000055af224d6350 in ?? ()
#10 0x00007fce8eb2cc6f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 5 (LWP 24051):
#0  0x00007fce8e73dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000002a in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055af2265cdc8 in ?? ()
#5  0x00007fce82646430 in ?? ()
#6  0x0000000000000054 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 4 (LWP 24050):
#0  0x00007fce8e73dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055af224bc848 in ?? ()
#5  0x00007fce82e47790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 3 (LWP 24049):
#0  0x00007fce8e73dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055af224bc2a8 in ?? ()
#5  0x00007fce83648790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 2 (LWP 24048):
#0  0x00007fce8e73dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055af224bc188 in ?? ()
#5  0x00007fce83e49790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 1 (LWP 24045):
#0  0x00007fce8e741d50 in ?? ()
#1  0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20251024 08:17:04.599313 18753 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 1 with UUID 773ff64ed1b249db9be71c247d7cbf43 and pid 23911
************************ BEGIN STACKS **************************
[New LWP 23912]
[New LWP 23913]
[New LWP 23914]
[New LWP 23915]
[New LWP 23921]
[New LWP 23922]
[New LWP 23923]
[New LWP 23926]
[New LWP 23927]
[New LWP 23928]
[New LWP 23929]
[New LWP 23930]
[New LWP 23931]
[New LWP 23933]
[New LWP 23934]
[New LWP 23935]
[New LWP 23936]
[New LWP 23937]
[New LWP 23938]
[New LWP 23939]
[New LWP 23940]
[New LWP 23941]
[New LWP 23942]
[New LWP 23943]
[New LWP 23944]
[New LWP 23945]
[New LWP 23946]
[New LWP 23947]
[New LWP 23948]
[New LWP 23949]
[New LWP 23950]
[New LWP 23951]
[New LWP 23952]
[New LWP 23953]
[New LWP 23954]
[New LWP 23955]
[New LWP 23956]
[New LWP 23957]
[New LWP 23958]
[New LWP 23959]
[New LWP 23960]
[New LWP 23961]
[New LWP 23962]
[New LWP 23963]
[New LWP 23964]
[New LWP 23965]
[New LWP 23966]
[New LWP 23967]
[New LWP 23968]
[New LWP 23969]
[New LWP 23970]
[New LWP 23971]
[New LWP 23972]
[New LWP 23973]
[New LWP 23974]
[New LWP 23975]
[New LWP 23976]
[New LWP 23977]
[New LWP 23978]
[New LWP 23979]
[New LWP 23980]
[New LWP 23981]
[New LWP 23982]
[New LWP 23983]
[New LWP 23984]
[New LWP 23985]
[New LWP 23986]
[New LWP 23987]
[New LWP 23988]
[New LWP 23989]
[New LWP 23990]
[New LWP 23991]
[New LWP 23992]
[New LWP 23993]
[New LWP 23994]
[New LWP 23995]
[New LWP 23996]
[New LWP 23997]
[New LWP 23998]
[New LWP 23999]
[New LWP 24000]
[New LWP 24001]
[New LWP 24002]
[New LWP 24003]
[New LWP 24004]
[New LWP 24005]
[New LWP 24006]
[New LWP 24007]
[New LWP 24008]
[New LWP 24009]
[New LWP 24010]
[New LWP 24011]
[New LWP 24012]
[New LWP 24013]
[New LWP 24014]
[New LWP 24015]
[New LWP 24016]
[New LWP 24017]
[New LWP 24018]
[New LWP 24019]
[New LWP 24020]
[New LWP 24021]
[New LWP 24022]
[New LWP 24023]
[New LWP 24024]
[New LWP 24025]
[New LWP 24026]
[New LWP 24027]
[New LWP 24028]
[New LWP 24029]
[New LWP 24030]
[New LWP 24031]
[New LWP 24032]
[New LWP 24033]
[New LWP 24034]
[New LWP 24035]
[New LWP 24036]
[New LWP 24037]
[New LWP 24038]
[New LWP 24039]
[New LWP 24040]
[New LWP 24041]
[New LWP 24611]
0x00007f006999cd50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 23911 "kudu"  0x00007f006999cd50 in ?? ()
  2    LWP 23912 "kudu"  0x00007f0069998fb9 in ?? ()
  3    LWP 23913 "kudu"  0x00007f0069998fb9 in ?? ()
  4    LWP 23914 "kudu"  0x00007f0069998fb9 in ?? ()
  5    LWP 23915 "kernel-watcher-" 0x00007f0069998fb9 in ?? ()
  6    LWP 23921 "ntp client-2392" 0x00007f006999c9e2 in ?? ()
  7    LWP 23922 "file cache-evic" 0x00007f0069998fb9 in ?? ()
  8    LWP 23923 "sq_acceptor" 0x00007f0067aadcb9 in ?? ()
  9    LWP 23926 "rpc reactor-239" 0x00007f0067abaa47 in ?? ()
  10   LWP 23927 "rpc reactor-239" 0x00007f0067abaa47 in ?? ()
  11   LWP 23928 "rpc reactor-239" 0x00007f0067abaa47 in ?? ()
  12   LWP 23929 "rpc reactor-239" 0x00007f0067abaa47 in ?? ()
  13   LWP 23930 "MaintenanceMgr " 0x00007f0069998ad3 in ?? ()
  14   LWP 23931 "txn-status-mana" 0x00007f0069998fb9 in ?? ()
  15   LWP 23933 "collect_and_rem" 0x00007f0069998fb9 in ?? ()
  16   LWP 23934 "tc-session-exp-" 0x00007f0069998fb9 in ?? ()
  17   LWP 23935 "rpc worker-2393" 0x00007f0069998ad3 in ?? ()
  18   LWP 23936 "rpc worker-2393" 0x00007f0069998ad3 in ?? ()
  19   LWP 23937 "rpc worker-2393" 0x00007f0069998ad3 in ?? ()
  20   LWP 23938 "rpc worker-2393" 0x00007f0069998ad3 in ?? ()
  21   LWP 23939 "rpc worker-2393" 0x00007f0069998ad3 in ?? ()
  22   LWP 23940 "rpc worker-2394" 0x00007f0069998ad3 in ?? ()
  23   LWP 23941 "rpc worker-2394" 0x00007f0069998ad3 in ?? ()
  24   LWP 23942 "rpc worker-2394" 0x00007f0069998ad3 in ?? ()
  25   LWP 23943 "rpc worker-2394" 0x00007f0069998ad3 in ?? ()
  26   LWP 23944 "rpc worker-2394" 0x00007f0069998ad3 in ?? ()
  27   LWP 23945 "rpc worker-2394" 0x00007f0069998ad3 in ?? ()
  28   LWP 23946 "rpc worker-2394" 0x00007f0069998ad3 in ?? ()
  29   LWP 23947 "rpc worker-2394" 0x00007f0069998ad3 in ?? ()
  30   LWP 23948 "rpc worker-2394" 0x00007f0069998ad3 in ?? ()
  31   LWP 23949 "rpc worker-2394" 0x00007f0069998ad3 in ?? ()
  32   LWP 23950 "rpc worker-2395" 0x00007f0069998ad3 in ?? ()
  33   LWP 23951 "rpc worker-2395" 0x00007f0069998ad3 in ?? ()
  34   LWP 23952 "rpc worker-2395" 0x00007f0069998ad3 in ?? ()
  35   LWP 23953 "rpc worker-2395" 0x00007f0069998ad3 in ?? ()
  36   LWP 23954 "rpc worker-2395" 0x00007f0069998ad3 in ?? ()
  37   LWP 23955 "rpc worker-2395" 0x00007f0069998ad3 in ?? ()
  38   LWP 23956 "rpc worker-2395" 0x00007f0069998ad3 in ?? ()
  39   LWP 23957 "rpc worker-2395" 0x00007f0069998ad3 in ?? ()
  40   LWP 23958 "rpc worker-2395" 0x00007f0069998ad3 in ?? ()
  41   LWP 23959 "rpc worker-2395" 0x00007f0069998ad3 in ?? ()
  42   LWP 23960 "rpc worker-2396" 0x00007f0069998ad3 in ?? ()
  43   LWP 23961 "rpc worker-2396" 0x00007f0069998ad3 in ?? ()
  44   LWP 23962 "rpc worker-2396" 0x00007f0069998ad3 in ?? ()
  45   LWP 23963 "rpc worker-2396" 0x00007f0069998ad3 in ?? ()
  46   LWP 23964 "rpc worker-2396" 0x00007f0069998ad3 in ?? ()
  47   LWP 23965 "rpc worker-2396" 0x00007f0069998ad3 in ?? ()
  48   LWP 23966 "rpc worker-2396" 0x00007f0069998ad3 in ?? ()
  49   LWP 23967 "rpc worker-2396" 0x00007f0069998ad3 in ?? ()
  50   LWP 23968 "rpc worker-2396" 0x00007f0069998ad3 in ?? ()
  51   LWP 23969 "rpc worker-2396" 0x00007f0069998ad3 in ?? ()
  52   LWP 23970 "rpc worker-2397" 0x00007f0069998ad3 in ?? ()
  53   LWP 23971 "rpc worker-2397" 0x00007f0069998ad3 in ?? ()
  54   LWP 23972 "rpc worker-2397" 0x00007f0069998ad3 in ?? ()
  55   LWP 23973 "rpc worker-2397" 0x00007f0069998ad3 in ?? ()
  56   LWP 23974 "rpc worker-2397" 0x00007f0069998ad3 in ?? ()
  57   LWP 23975 "rpc worker-2397" 0x00007f0069998ad3 in ?? ()
  58   LWP 23976 "rpc worker-2397" 0x00007f0069998ad3 in ?? ()
  59   LWP 23977 "rpc worker-2397" 0x00007f0069998ad3 in ?? ()
  60   LWP 23978 "rpc worker-2397" 0x00007f0069998ad3 in ?? ()
  61   LWP 23979 "rpc worker-2397" 0x00007f0069998ad3 in ?? ()
  62   LWP 23980 "rpc worker-2398" 0x00007f0069998ad3 in ?? ()
  63   LWP 23981 "rpc worker-2398" 0x00007f0069998ad3 in ?? ()
  64   LWP 23982 "rpc worker-2398" 0x00007f0069998ad3 in ?? ()
  65   LWP 23983 "rpc worker-2398" 0x00007f0069998ad3 in ?? ()
  66   LWP 23984 "rpc worker-2398" 0x00007f0069998ad3 in ?? ()
  67   LWP 23985 "rpc worker-2398" 0x00007f0069998ad3 in ?? ()
  68   LWP 23986 "rpc worker-2398" 0x00007f0069998ad3 in ?? ()
  69   LWP 23987 "rpc worker-2398" 0x00007f0069998ad3 in ?? ()
  70   LWP 23988 "rpc worker-2398" 0x00007f0069998ad3 in ?? ()
  71   LWP 23989 "rpc worker-2398" 0x00007f0069998ad3 in ?? ()
  72   LWP 23990 "rpc worker-2399" 0x00007f0069998ad3 in ?? ()
  73   LWP 23991 "rpc worker-2399" 0x00007f0069998ad3 in ?? ()
  74   LWP 23992 "rpc worker-2399" 0x00007f0069998ad3 in ?? ()
  75   LWP 23993 "rpc worker-2399" 0x00007f0069998ad3 in ?? ()
  76   LWP 23994 "rpc worker-2399" 0x00007f0069998ad3 in ?? ()
  77   LWP 23995 "rpc worker-2399" 0x00007f0069998ad3 in ?? ()
  78   LWP 23996 "rpc worker-2399" 0x00007f0069998ad3 in ?? ()
  79   LWP 23997 "rpc worker-2399" 0x00007f0069998ad3 in ?? ()
  80   LWP 23998 "rpc worker-2399" 0x00007f0069998ad3 in ?? ()
  81   LWP 23999 "rpc worker-2399" 0x00007f0069998ad3 in ?? ()
  82   LWP 24000 "rpc worker-2400" 0x00007f0069998ad3 in ?? ()
  83   LWP 24001 "rpc worker-2400" 0x00007f0069998ad3 in ?? ()
  84   LWP 24002 "rpc worker-2400" 0x00007f0069998ad3 in ?? ()
  85   LWP 24003 "rpc worker-2400" 0x00007f0069998ad3 in ?? ()
  86   LWP 24004 "rpc worker-2400" 0x00007f0069998ad3 in ?? ()
  87   LWP 24005 "rpc worker-2400" 0x00007f0069998ad3 in ?? ()
  88   LWP 24006 "rpc worker-2400" 0x00007f0069998ad3 in ?? ()
  89   LWP 24007 "rpc worker-2400" 0x00007f0069998ad3 in ?? ()
  90   LWP 24008 "rpc worker-2400" 0x00007f0069998ad3 in ?? ()
  91   LWP 24009 "rpc worker-2400" 0x00007f0069998ad3 in ?? ()
  92   LWP 24010 "rpc worker-2401" 0x00007f0069998ad3 in ?? ()
  93   LWP 24011 "rpc worker-2401" 0x00007f0069998ad3 in ?? ()
  94   LWP 24012 "rpc worker-2401" 0x00007f0069998ad3 in ?? ()
  95   LWP 24013 "rpc worker-2401" 0x00007f0069998ad3 in ?? ()
  96   LWP 24014 "rpc worker-2401" 0x00007f0069998ad3 in ?? ()
  97   LWP 24015 "rpc worker-2401" 0x00007f0069998ad3 in ?? ()
  98   LWP 24016 "rpc worker-2401" 0x00007f0069998ad3 in ?? ()
  99   LWP 24017 "rpc worker-2401" 0x00007f0069998ad3 in ?? ()
  100  LWP 24018 "rpc worker-2401" 0x00007f0069998ad3 in ?? ()
  101  LWP 24019 "rpc worker-2401" 0x00007f0069998ad3 in ?? ()
  102  LWP 24020 "rpc worker-2402" 0x00007f0069998ad3 in ?? ()
  103  LWP 24021 "rpc worker-2402" 0x00007f0069998ad3 in ?? ()
  104  LWP 24022 "rpc worker-2402" 0x00007f0069998ad3 in ?? ()
  105  LWP 24023 "rpc worker-2402" 0x00007f0069998ad3 in ?? ()
  106  LWP 24024 "rpc worker-2402" 0x00007f0069998ad3 in ?? ()
  107  LWP 24025 "rpc worker-2402" 0x00007f0069998ad3 in ?? ()
  108  LWP 24026 "rpc worker-2402" 0x00007f0069998ad3 in ?? ()
  109  LWP 24027 "rpc worker-2402" 0x00007f0069998ad3 in ?? ()
  110  LWP 24028 "rpc worker-2402" 0x00007f0069998ad3 in ?? ()
  111  LWP 24029 "rpc worker-2402" 0x00007f0069998ad3 in ?? ()
  112  LWP 24030 "rpc worker-2403" 0x00007f0069998ad3 in ?? ()
  113  LWP 24031 "rpc worker-2403" 0x00007f0069998ad3 in ?? ()
  114  LWP 24032 "rpc worker-2403" 0x00007f0069998ad3 in ?? ()
  115  LWP 24033 "rpc worker-2403" 0x00007f0069998ad3 in ?? ()
  116  LWP 24034 "rpc worker-2403" 0x00007f0069998ad3 in ?? ()
  117  LWP 24035 "diag-logger-240" 0x00007f0069998fb9 in ?? ()
  118  LWP 24036 "result-tracker-" 0x00007f0069998fb9 in ?? ()
  119  LWP 24037 "excess-log-dele" 0x00007f0069998fb9 in ?? ()
  120  LWP 24038 "tcmalloc-memory" 0x00007f0069998fb9 in ?? ()
  121  LWP 24039 "acceptor-24039" 0x00007f0067abc0c7 in ?? ()
  122  LWP 24040 "heartbeat-24040" 0x00007f0069998fb9 in ?? ()
  123  LWP 24041 "maintenance_sch" 0x00007f0069998fb9 in ?? ()
  124  LWP 24611 "raft [worker]-2" 0x00007f0069998fb9 in ?? ()

Thread 124 (LWP 24611):
#0  0x00007f0069998fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000009b in ?? ()
#3  0x0000000100000081 in ?? ()
#4  0x00007f001f43d764 in ?? ()
#5  0x00007f001f43d510 in ?? ()
#6  0x0000000000000137 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00007f001f43d530 in ?? ()
#9  0x0000000200000000 in ?? ()
#10 0x0000008000000189 in ?? ()
#11 0x00007f001f43d590 in ?? ()
#12 0x00007f006960c2e1 in ?? ()
#13 0x0000000000000000 in ?? ()

Thread 123 (LWP 24041):
#0  0x00007f0069998fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000024 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055ae833fbe50 in ?? ()
#5  0x00007f002043f470 in ?? ()
#6  0x0000000000000048 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 24040):
#0  0x00007f0069998fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000000c in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055ae83365930 in ?? ()
#5  0x00007f0020c403f0 in ?? ()
#6  0x0000000000000018 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 121 (LWP 24039):
#0  0x00007f0067abc0c7 in ?? ()
#1  0x00007f0021441020 in ?? ()
#2  0x00007f006961cc02 in ?? ()
#3  0x00007f0021441020 in ?? ()
#4  0x0000000000080800 in ?? ()
#5  0x00007f00214413e0 in ?? ()
#6  0x00007f0021441090 in ?? ()
#7  0x000055ae833210f8 in ?? ()
#8  0x00007f0069622699 in ?? ()
#9  0x00007f0021441510 in ?? ()
#10 0x00007f0021441700 in ?? ()
#11 0x0000008000000000 in ?? ()
#12 0x00007f006999c3a7 in ?? ()
#13 0x00007f0021442520 in ?? ()
#14 0x00007f0021441260 in ?? ()
#15 0x000055ae833c10c0 in ?? ()
#16 0x0000000000000000 in ?? ()

Thread 120 (LWP 24038):
#0  0x00007f0069998fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffd0726bb00 in ?? ()
#5  0x00007f0021c42670 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 24037):
#0  0x00007f0069998fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 118 (LWP 24036):
#0  0x00007f0069998fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055ae83299b70 in ?? ()
#5  0x00007f0022c44680 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 24035):
#0  0x00007f0069998fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055ae83615390 in ?? ()
#5  0x00007f0023445550 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 116 (LWP 24034):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000008 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055ae83616738 in ?? ()
#4  0x00007f0023c465d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f0023c465f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 115 (LWP 24033):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 114 (LWP 24032):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 24031):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 24030):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 24029):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 24028):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 24027):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 24026):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 24025):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 24024):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 24023):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 24022):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 24021):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 24020):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 24019):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 24018):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 24017):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 24016):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 97 (LWP 24015):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 24014):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 24013):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 24012):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 24011):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 24010):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 24009):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 24008):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 24007):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 24006):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 24005):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 24004):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 24003):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 24002):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 24001):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 24000):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 23999):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 23998):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 23997):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 23996):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 23995):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 76 (LWP 23994):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055ae835e1138 in ?? ()
#4  0x00007f0037c6e5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f0037c6e5f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 75 (LWP 23993):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 74 (LWP 23992):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 23991):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 23990):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 23989):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 23988):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 23987):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 23986):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 23985):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 23984):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 23983):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 23982):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 23981):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 23980):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 23979):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 23978):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 23977):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 23976):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 23975):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 56 (LWP 23974):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055ae835e0638 in ?? ()
#4  0x00007f0041c825d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f0041c825f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 55 (LWP 23973):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 23972):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 23971):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 23970):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 23969):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 23968):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 23967):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 23966):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 23965):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 23964):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 23963):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 23962):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 23961):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 23960):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 23959):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 23958):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 23957):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 23956):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 23955):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 36 (LWP 23954):
#0  0x00007f0069998ad3 in ?? ()
#1  0x000000000000023d in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055ae83503b3c in ?? ()
#4  0x00007f004bc965d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f004bc965f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055ae83503b28 in ?? ()
#9  0x00007f0069998770 in ?? ()
#10 0x00007f004bc965f0 in ?? ()
#11 0x00007f004bc96650 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 35 (LWP 23953):
#0  0x00007f0069998ad3 in ?? ()
#1  0x000000000000013e in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055ae83503ab8 in ?? ()
#4  0x00007f004c4975d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f004c4975f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 34 (LWP 23952):
#0  0x00007f0069998ad3 in ?? ()
#1  0x00000000000001cc in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055ae83503a38 in ?? ()
#4  0x00007f004cc985d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f004cc985f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 33 (LWP 23951):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 32 (LWP 23950):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 31 (LWP 23949):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 30 (LWP 23948):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 23947):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 23946):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 23945):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 23944):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 23943):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 23942):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 23941):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 23940):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 23939):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 23938):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 19 (LWP 23937):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 18 (LWP 23936):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 17 (LWP 23935):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 16 (LWP 23934):
#0  0x00007f0069998fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 15 (LWP 23933):
#0  0x00007f0069998fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055ae8327f6c8 in ?? ()
#5  0x00007f00564ab6a0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 23931):
#0  0x00007f0069998fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 13 (LWP 23930):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 23929):
#0  0x00007f0067abaa47 in ?? ()
#1  0x00007f00584af680 in ?? ()
#2  0x00007f0062dbe571 in ?? ()
#3  0x00000000000002b7 in ?? ()
#4  0x000055ae83376e58 in ?? ()
#5  0x00007f00584af6c0 in ?? ()
#6  0x00007f00584af840 in ?? ()
#7  0x000055ae83418c10 in ?? ()
#8  0x00007f0062dc025d in ?? ()
#9  0x3fb307dbec7a8000 in ?? ()
#10 0x000055ae83368c00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055ae83368c00 in ?? ()
#13 0x0000000083376e58 in ?? ()
#14 0x000055ae00000000 in ?? ()
#15 0x41da3ecc3e246888 in ?? ()
#16 0x000055ae83418c10 in ?? ()
#17 0x00007f00584af720 in ?? ()
#18 0x00007f0062dc4ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb307dbec7a8000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 11 (LWP 23928):
#0  0x00007f0067abaa47 in ?? ()
#1  0x00007f0058cb0680 in ?? ()
#2  0x00007f0062dbe571 in ?? ()
#3  0x00000000000002b7 in ?? ()
#4  0x000055ae83377a98 in ?? ()
#5  0x00007f0058cb06c0 in ?? ()
#6  0x00007f0058cb0840 in ?? ()
#7  0x000055ae83418c10 in ?? ()
#8  0x00007f0062dc025d in ?? ()
#9  0x3fb9756cad630000 in ?? ()
#10 0x000055ae83368100 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055ae83368100 in ?? ()
#13 0x0000000083377a98 in ?? ()
#14 0x000055ae00000000 in ?? ()
#15 0x41da3ecc3e24688b in ?? ()
#16 0x000055ae83418c10 in ?? ()
#17 0x00007f0058cb0720 in ?? ()
#18 0x00007f0062dc4ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb9756cad630000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 10 (LWP 23927):
#0  0x00007f0067abaa47 in ?? ()
#1  0x00007f00594b1680 in ?? ()
#2  0x00007f0062dbe571 in ?? ()
#3  0x00000000000002b7 in ?? ()
#4  0x000055ae83377c58 in ?? ()
#5  0x00007f00594b16c0 in ?? ()
#6  0x00007f00594b1840 in ?? ()
#7  0x000055ae83418c10 in ?? ()
#8  0x00007f0062dc025d in ?? ()
#9  0x3fa5a86005e80000 in ?? ()
#10 0x000055ae83367b80 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055ae83367b80 in ?? ()
#13 0x0000000083377c58 in ?? ()
#14 0x000055ae00000000 in ?? ()
#15 0x41da3ecc3e24688a in ?? ()
#16 0x000055ae83418c10 in ?? ()
#17 0x00007f00594b1720 in ?? ()
#18 0x00007f0062dc4ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fa5a86005e80000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 9 (LWP 23926):
#0  0x00007f0067abaa47 in ?? ()
#1  0x00007f005b09c680 in ?? ()
#2  0x00007f0062dbe571 in ?? ()
#3  0x00000000000002b7 in ?? ()
#4  0x000055ae83377e18 in ?? ()
#5  0x00007f005b09c6c0 in ?? ()
#6  0x00007f005b09c840 in ?? ()
#7  0x000055ae83418c10 in ?? ()
#8  0x00007f0062dc025d in ?? ()
#9  0x3fa552038b710000 in ?? ()
#10 0x000055ae83367600 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055ae83367600 in ?? ()
#13 0x0000000083377e18 in ?? ()
#14 0x000055ae00000000 in ?? ()
#15 0x41da3ecc3e24688a in ?? ()
#16 0x000055ae83418c10 in ?? ()
#17 0x00007f005b09c720 in ?? ()
#18 0x00007f0062dc4ba3 in ?? ()
#19 0x0000000000000000 in ?? ()

Thread 8 (LWP 23923):
#0  0x00007f0067aadcb9 in ?? ()
#1  0x00007f005c89f840 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 7 (LWP 23922):
#0  0x00007f0069998fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 6 (LWP 23921):
#0  0x00007f006999c9e2 in ?? ()
#1  0x000055ae83299ee0 in ?? ()
#2  0x00007f005b89d4d0 in ?? ()
#3  0x00007f005b89d450 in ?? ()
#4  0x00007f005b89d570 in ?? ()
#5  0x00007f005b89d790 in ?? ()
#6  0x00007f005b89d7a0 in ?? ()
#7  0x00007f005b89d4e0 in ?? ()
#8  0x00007f005b89d4d0 in ?? ()
#9  0x000055ae83298350 in ?? ()
#10 0x00007f0069d87c6f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 5 (LWP 23915):
#0  0x00007f0069998fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000002d in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055ae8341edc8 in ?? ()
#5  0x00007f005d8a1430 in ?? ()
#6  0x000000000000005a in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 4 (LWP 23914):
#0  0x00007f0069998fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055ae8327e848 in ?? ()
#5  0x00007f005e0a2790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 3 (LWP 23913):
#0  0x00007f0069998fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055ae8327e2a8 in ?? ()
#5  0x00007f005e8a3790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 2 (LWP 23912):
#0  0x00007f0069998fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055ae8327e188 in ?? ()
#5  0x00007f005f0a4790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 1 (LWP 23911):
#0  0x00007f006999cd50 in ?? ()
#1  0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20251024 08:17:05.088961 18753 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 2 with UUID e9ac8f0e11a34e5fb1c19a793f211a56 and pid 24311
************************ BEGIN STACKS **************************
[New LWP 24313]
[New LWP 24314]
[New LWP 24315]
[New LWP 24316]
[New LWP 24322]
[New LWP 24323]
[New LWP 24324]
[New LWP 24327]
[New LWP 24328]
[New LWP 24329]
[New LWP 24330]
[New LWP 24331]
[New LWP 24332]
[New LWP 24334]
[New LWP 24335]
[New LWP 24336]
[New LWP 24337]
[New LWP 24338]
[New LWP 24339]
[New LWP 24340]
[New LWP 24341]
[New LWP 24342]
[New LWP 24343]
[New LWP 24344]
[New LWP 24345]
[New LWP 24346]
[New LWP 24347]
[New LWP 24348]
[New LWP 24349]
[New LWP 24350]
[New LWP 24351]
[New LWP 24352]
[New LWP 24353]
[New LWP 24354]
[New LWP 24355]
[New LWP 24356]
[New LWP 24357]
[New LWP 24358]
[New LWP 24359]
[New LWP 24360]
[New LWP 24361]
[New LWP 24362]
[New LWP 24363]
[New LWP 24364]
[New LWP 24365]
[New LWP 24366]
[New LWP 24367]
[New LWP 24368]
[New LWP 24369]
[New LWP 24370]
[New LWP 24371]
[New LWP 24372]
[New LWP 24373]
[New LWP 24374]
[New LWP 24375]
[New LWP 24376]
[New LWP 24377]
[New LWP 24378]
[New LWP 24379]
[New LWP 24380]
[New LWP 24381]
[New LWP 24382]
[New LWP 24383]
[New LWP 24384]
[New LWP 24385]
[New LWP 24386]
[New LWP 24387]
[New LWP 24388]
[New LWP 24389]
[New LWP 24390]
[New LWP 24391]
[New LWP 24392]
[New LWP 24393]
[New LWP 24394]
[New LWP 24395]
[New LWP 24396]
[New LWP 24397]
[New LWP 24398]
[New LWP 24399]
[New LWP 24400]
[New LWP 24401]
[New LWP 24402]
[New LWP 24403]
[New LWP 24404]
[New LWP 24405]
[New LWP 24406]
[New LWP 24407]
[New LWP 24408]
[New LWP 24409]
[New LWP 24410]
[New LWP 24411]
[New LWP 24412]
[New LWP 24413]
[New LWP 24414]
[New LWP 24415]
[New LWP 24416]
[New LWP 24417]
[New LWP 24418]
[New LWP 24419]
[New LWP 24420]
[New LWP 24421]
[New LWP 24422]
[New LWP 24423]
[New LWP 24424]
[New LWP 24425]
[New LWP 24426]
[New LWP 24427]
[New LWP 24428]
[New LWP 24429]
[New LWP 24430]
[New LWP 24431]
[New LWP 24432]
[New LWP 24433]
[New LWP 24434]
[New LWP 24435]
[New LWP 24436]
[New LWP 24437]
[New LWP 24438]
[New LWP 24439]
[New LWP 24440]
[New LWP 24441]
[New LWP 24442]
0x00007fc5186b3d50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 24311 "kudu"  0x00007fc5186b3d50 in ?? ()
  2    LWP 24313 "kudu"  0x00007fc5186affb9 in ?? ()
  3    LWP 24314 "kudu"  0x00007fc5186affb9 in ?? ()
  4    LWP 24315 "kudu"  0x00007fc5186affb9 in ?? ()
  5    LWP 24316 "kernel-watcher-" 0x00007fc5186affb9 in ?? ()
  6    LWP 24322 "ntp client-2432" 0x00007fc5186b39e2 in ?? ()
  7    LWP 24323 "file cache-evic" 0x00007fc5186affb9 in ?? ()
  8    LWP 24324 "sq_acceptor" 0x00007fc5167c4cb9 in ?? ()
  9    LWP 24327 "rpc reactor-243" 0x00007fc5167d1a47 in ?? ()
  10   LWP 24328 "rpc reactor-243" 0x00007fc5167d1a47 in ?? ()
  11   LWP 24329 "rpc reactor-243" 0x00007fc5167d1a47 in ?? ()
  12   LWP 24330 "rpc reactor-243" 0x00007fc5167d1a47 in ?? ()
  13   LWP 24331 "MaintenanceMgr " 0x00007fc5186afad3 in ?? ()
  14   LWP 24332 "txn-status-mana" 0x00007fc5186affb9 in ?? ()
  15   LWP 24334 "collect_and_rem" 0x00007fc5186affb9 in ?? ()
  16   LWP 24335 "tc-session-exp-" 0x00007fc5186affb9 in ?? ()
  17   LWP 24336 "rpc worker-2433" 0x00007fc5186afad3 in ?? ()
  18   LWP 24337 "rpc worker-2433" 0x00007fc5186afad3 in ?? ()
  19   LWP 24338 "rpc worker-2433" 0x00007fc5186afad3 in ?? ()
  20   LWP 24339 "rpc worker-2433" 0x00007fc5186afad3 in ?? ()
  21   LWP 24340 "rpc worker-2434" 0x00007fc5186afad3 in ?? ()
  22   LWP 24341 "rpc worker-2434" 0x00007fc5186afad3 in ?? ()
  23   LWP 24342 "rpc worker-2434" 0x00007fc5186afad3 in ?? ()
  24   LWP 24343 "rpc worker-2434" 0x00007fc5186afad3 in ?? ()
  25   LWP 24344 "rpc worker-2434" 0x00007fc5186afad3 in ?? ()
  26   LWP 24345 "rpc worker-2434" 0x00007fc5186afad3 in ?? ()
  27   LWP 24346 "rpc worker-2434" 0x00007fc5186afad3 in ?? ()
  28   LWP 24347 "rpc worker-2434" 0x00007fc5186afad3 in ?? ()
  29   LWP 24348 "rpc worker-2434" 0x00007fc5186afad3 in ?? ()
  30   LWP 24349 "rpc worker-2434" 0x00007fc5186afad3 in ?? ()
  31   LWP 24350 "rpc worker-2435" 0x00007fc5186afad3 in ?? ()
  32   LWP 24351 "rpc worker-2435" 0x00007fc5186afad3 in ?? ()
  33   LWP 24352 "rpc worker-2435" 0x00007fc5186afad3 in ?? ()
  34   LWP 24353 "rpc worker-2435" 0x00007fc5186afad3 in ?? ()
  35   LWP 24354 "rpc worker-2435" 0x00007fc5186afad3 in ?? ()
  36   LWP 24355 "rpc worker-2435" 0x00007fc5186afad3 in ?? ()
  37   LWP 24356 "rpc worker-2435" 0x00007fc5186afad3 in ?? ()
  38   LWP 24357 "rpc worker-2435" 0x00007fc5186afad3 in ?? ()
  39   LWP 24358 "rpc worker-2435" 0x00007fc5186afad3 in ?? ()
  40   LWP 24359 "rpc worker-2435" 0x00007fc5186afad3 in ?? ()
  41   LWP 24360 "rpc worker-2436" 0x00007fc5186afad3 in ?? ()
  42   LWP 24361 "rpc worker-2436" 0x00007fc5186afad3 in ?? ()
  43   LWP 24362 "rpc worker-2436" 0x00007fc5186afad3 in ?? ()
  44   LWP 24363 "rpc worker-2436" 0x00007fc5186afad3 in ?? ()
  45   LWP 24364 "rpc worker-2436" 0x00007fc5186afad3 in ?? ()
  46   LWP 24365 "rpc worker-2436" 0x00007fc5186afad3 in ?? ()
  47   LWP 24366 "rpc worker-2436" 0x00007fc5186afad3 in ?? ()
  48   LWP 24367 "rpc worker-2436" 0x00007fc5186afad3 in ?? ()
  49   LWP 24368 "rpc worker-2436" 0x00007fc5186afad3 in ?? ()
  50   LWP 24369 "rpc worker-2436" 0x00007fc5186afad3 in ?? ()
  51   LWP 24370 "rpc worker-2437" 0x00007fc5186afad3 in ?? ()
  52   LWP 24371 "rpc worker-2437" 0x00007fc5186afad3 in ?? ()
  53   LWP 24372 "rpc worker-2437" 0x00007fc5186afad3 in ?? ()
  54   LWP 24373 "rpc worker-2437" 0x00007fc5186afad3 in ?? ()
  55   LWP 24374 "rpc worker-2437" 0x00007fc5186afad3 in ?? ()
  56   LWP 24375 "rpc worker-2437" 0x00007fc5186afad3 in ?? ()
  57   LWP 24376 "rpc worker-2437" 0x00007fc5186afad3 in ?? ()
  58   LWP 24377 "rpc worker-2437" 0x00007fc5186afad3 in ?? ()
  59   LWP 24378 "rpc worker-2437" 0x00007fc5186afad3 in ?? ()
  60   LWP 24379 "rpc worker-2437" 0x00007fc5186afad3 in ?? ()
  61   LWP 24380 "rpc worker-2438" 0x00007fc5186afad3 in ?? ()
  62   LWP 24381 "rpc worker-2438" 0x00007fc5186afad3 in ?? ()
  63   LWP 24382 "rpc worker-2438" 0x00007fc5186afad3 in ?? ()
  64   LWP 24383 "rpc worker-2438" 0x00007fc5186afad3 in ?? ()
  65   LWP 24384 "rpc worker-2438" 0x00007fc5186afad3 in ?? ()
  66   LWP 24385 "rpc worker-2438" 0x00007fc5186afad3 in ?? ()
  67   LWP 24386 "rpc worker-2438" 0x00007fc5186afad3 in ?? ()
  68   LWP 24387 "rpc worker-2438" 0x00007fc5186afad3 in ?? ()
  69   LWP 24388 "rpc worker-2438" 0x00007fc5186afad3 in ?? ()
  70   LWP 24389 "rpc worker-2438" 0x00007fc5186afad3 in ?? ()
  71   LWP 24390 "rpc worker-2439" 0x00007fc5186afad3 in ?? ()
  72   LWP 24391 "rpc worker-2439" 0x00007fc5186afad3 in ?? ()
  73   LWP 24392 "rpc worker-2439" 0x00007fc5186afad3 in ?? ()
  74   LWP 24393 "rpc worker-2439" 0x00007fc5186afad3 in ?? ()
  75   LWP 24394 "rpc worker-2439" 0x00007fc5186afad3 in ?? ()
  76   LWP 24395 "rpc worker-2439" 0x00007fc5186afad3 in ?? ()
  77   LWP 24396 "rpc worker-2439" 0x00007fc5186afad3 in ?? ()
  78   LWP 24397 "rpc worker-2439" 0x00007fc5186afad3 in ?? ()
  79   LWP 24398 "rpc worker-2439" 0x00007fc5186afad3 in ?? ()
  80   LWP 24399 "rpc worker-2439" 0x00007fc5186afad3 in ?? ()
  81   LWP 24400 "rpc worker-2440" 0x00007fc5186afad3 in ?? ()
  82   LWP 24401 "rpc worker-2440" 0x00007fc5186afad3 in ?? ()
  83   LWP 24402 "rpc worker-2440" 0x00007fc5186afad3 in ?? ()
  84   LWP 24403 "rpc worker-2440" 0x00007fc5186afad3 in ?? ()
  85   LWP 24404 "rpc worker-2440" 0x00007fc5186afad3 in ?? ()
  86   LWP 24405 "rpc worker-2440" 0x00007fc5186afad3 in ?? ()
  87   LWP 24406 "rpc worker-2440" 0x00007fc5186afad3 in ?? ()
  88   LWP 24407 "rpc worker-2440" 0x00007fc5186afad3 in ?? ()
  89   LWP 24408 "rpc worker-2440" 0x00007fc5186afad3 in ?? ()
  90   LWP 24409 "rpc worker-2440" 0x00007fc5186afad3 in ?? ()
  91   LWP 24410 "rpc worker-2441" 0x00007fc5186afad3 in ?? ()
  92   LWP 24411 "rpc worker-2441" 0x00007fc5186afad3 in ?? ()
  93   LWP 24412 "rpc worker-2441" 0x00007fc5186afad3 in ?? ()
  94   LWP 24413 "rpc worker-2441" 0x00007fc5186afad3 in ?? ()
  95   LWP 24414 "rpc worker-2441" 0x00007fc5186afad3 in ?? ()
  96   LWP 24415 "rpc worker-2441" 0x00007fc5186afad3 in ?? ()
  97   LWP 24416 "rpc worker-2441" 0x00007fc5186afad3 in ?? ()
  98   LWP 24417 "rpc worker-2441" 0x00007fc5186afad3 in ?? ()
  99   LWP 24418 "rpc worker-2441" 0x00007fc5186afad3 in ?? ()
  100  LWP 24419 "rpc worker-2441" 0x00007fc5186afad3 in ?? ()
  101  LWP 24420 "rpc worker-2442" 0x00007fc5186afad3 in ?? ()
  102  LWP 24421 "rpc worker-2442" 0x00007fc5186afad3 in ?? ()
  103  LWP 24422 "rpc worker-2442" 0x00007fc5186afad3 in ?? ()
  104  LWP 24423 "rpc worker-2442" 0x00007fc5186afad3 in ?? ()
  105  LWP 24424 "rpc worker-2442" 0x00007fc5186afad3 in ?? ()
  106  LWP 24425 "rpc worker-2442" 0x00007fc5186afad3 in ?? ()
  107  LWP 24426 "rpc worker-2442" 0x00007fc5186afad3 in ?? ()
  108  LWP 24427 "rpc worker-2442" 0x00007fc5186afad3 in ?? ()
  109  LWP 24428 "rpc worker-2442" 0x00007fc5186afad3 in ?? ()
  110  LWP 24429 "rpc worker-2442" 0x00007fc5186afad3 in ?? ()
  111  LWP 24430 "rpc worker-2443" 0x00007fc5186afad3 in ?? ()
  112  LWP 24431 "rpc worker-2443" 0x00007fc5186afad3 in ?? ()
  113  LWP 24432 "rpc worker-2443" 0x00007fc5186afad3 in ?? ()
  114  LWP 24433 "rpc worker-2443" 0x00007fc5186afad3 in ?? ()
  115  LWP 24434 "rpc worker-2443" 0x00007fc5186afad3 in ?? ()
  116  LWP 24435 "rpc worker-2443" 0x00007fc5186afad3 in ?? ()
  117  LWP 24436 "diag-logger-244" 0x00007fc5186affb9 in ?? ()
  118  LWP 24437 "result-tracker-" 0x00007fc5186affb9 in ?? ()
  119  LWP 24438 "excess-log-dele" 0x00007fc5186affb9 in ?? ()
  120  LWP 24439 "tcmalloc-memory" 0x00007fc5186affb9 in ?? ()
  121  LWP 24440 "acceptor-24440" 0x00007fc5167d30c7 in ?? ()
  122  LWP 24441 "heartbeat-24441" 0x00007fc5186affb9 in ?? ()
  123  LWP 24442 "maintenance_sch" 0x00007fc5186affb9 in ?? ()

Thread 123 (LWP 24442):
#0  0x00007fc5186affb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000024 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055be6efbbe50 in ?? ()
#5  0x00007fc4cf156470 in ?? ()
#6  0x0000000000000048 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 24441):
#0  0x00007fc5186affb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000000b in ?? ()
#3  0x0000000100000081 in ?? ()
#4  0x000055be6ef25934 in ?? ()
#5  0x00007fc4cf9573f0 in ?? ()
#6  0x0000000000000017 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00007fc4cf957410 in ?? ()
#9  0x0000000200000000 in ?? ()
#10 0x0000008000000189 in ?? ()
#11 0x00007fc4cf957470 in ?? ()
#12 0x00007fc5183232e1 in ?? ()
#13 0x0000000000000000 in ?? ()

Thread 121 (LWP 24440):
#0  0x00007fc5167d30c7 in ?? ()
#1  0x00007fc4d0158020 in ?? ()
#2  0x00007fc518333c02 in ?? ()
#3  0x00007fc4d0158020 in ?? ()
#4  0x0000000000080800 in ?? ()
#5  0x00007fc4d01583e0 in ?? ()
#6  0x00007fc4d0158090 in ?? ()
#7  0x000055be6eee10f8 in ?? ()
#8  0x00007fc518339699 in ?? ()
#9  0x00007fc4d0158510 in ?? ()
#10 0x00007fc4d0158700 in ?? ()
#11 0x0000008000000000 in ?? ()
#12 0x00007fc5186b33a7 in ?? ()
#13 0x00007fc4d0159520 in ?? ()
#14 0x00007fc4d0158260 in ?? ()
#15 0x000055be6ef810c0 in ?? ()
#16 0x0000000000000000 in ?? ()

Thread 120 (LWP 24439):
#0  0x00007fc5186affb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffc68d41cc0 in ?? ()
#5  0x00007fc4d0959670 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 24438):
#0  0x00007fc5186affb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 118 (LWP 24437):
#0  0x00007fc5186affb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055be6ee59b70 in ?? ()
#5  0x00007fc4d195b680 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 24436):
#0  0x00007fc5186affb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055be6f1d0790 in ?? ()
#5  0x00007fc4d215c550 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 116 (LWP 24435):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000007 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055be6f1d933c in ?? ()
#4  0x00007fc4d295d5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc4d295d5f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055be6f1d9328 in ?? ()
#9  0x00007fc5186af770 in ?? ()
#10 0x00007fc4d295d5f0 in ?? ()
#11 0x00007fc4d295d650 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 115 (LWP 24434):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055be6f1d92bc in ?? ()
#4  0x00007fc4d315e5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc4d315e5f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055be6f1d92a8 in ?? ()
#9  0x00007fc5186af770 in ?? ()
#10 0x00007fc4d315e5f0 in ?? ()
#11 0x00007fc4d315e650 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 114 (LWP 24433):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 24432):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 24431):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 24430):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 24429):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 24428):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 24427):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 24426):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 24425):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 24424):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 24423):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 24422):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 24421):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 24420):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 24419):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 24418):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 24417):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 97 (LWP 24416):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 24415):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 24414):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 24413):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 24412):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 24411):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 24410):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 24409):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 24408):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 24407):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 24406):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 24405):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 24404):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 24403):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 24402):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 24401):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 24400):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 24399):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 24398):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 24397):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 24396):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 76 (LWP 24395):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 75 (LWP 24394):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 74 (LWP 24393):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 24392):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 24391):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 24390):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 24389):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 24388):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 24387):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 24386):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 24385):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 24384):
#0  0x00007fc5186afad3 in ?? ()
#1  0x00000000000004a8 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055be6f1d8f38 in ?? ()
#4  0x00007fc4ec1905d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc4ec1905f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 64 (LWP 24383):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 24382):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 24381):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 24380):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 24379):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 24378):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 24377):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 24376):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 56 (LWP 24375):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 55 (LWP 24374):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 24373):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 24372):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 24371):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 24370):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 24369):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 24368):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 24367):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 24366):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 24365):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 24364):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 24363):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 24362):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 24361):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 24360):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 24359):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 24358):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055be6f1d9038 in ?? ()
#4  0x00007fc4f91aa5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc4f91aa5f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 38 (LWP 24357):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 24356):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 36 (LWP 24355):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 35 (LWP 24354):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 34 (LWP 24353):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 33 (LWP 24352):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 32 (LWP 24351):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 31 (LWP 24350):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 30 (LWP 24349):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 24348):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 24347):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 24346):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 24345):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 24344):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 24343):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 24342):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 24341):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 24340):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 24339):
#0  0x00007fc5186afad3 in ?? ()
#1  0x000000000000178c in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055be6f1d9638 in ?? ()
#4  0x00007fc5029bd5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc5029bd5f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 19 (LWP 24338):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000001c70 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055be6f1d96b8 in ?? ()
#4  0x00007fc5031be5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc5031be5f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 18 (LWP 24337):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000001e31 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055be6f1d973c in ?? ()
#4  0x00007fc5039bf5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc5039bf5f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055be6f1d9728 in ?? ()
#9  0x00007fc5186af770 in ?? ()
#10 0x00007fc5039bf5f0 in ?? ()
#11 0x00007fc5039bf650 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 17 (LWP 24336):
#0  0x00007fc5186afad3 in ?? ()
#1  0x000000000000046d in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055be6eecdcbc in ?? ()
#4  0x00007fc5041c05d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc5041c05f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055be6eecdca8 in ?? ()
#9  0x00007fc5186af770 in ?? ()
#10 0x00007fc5041c05f0 in ?? ()
#11 0x00007fc5041c0650 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 16 (LWP 24335):
#0  0x00007fc5186affb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 15 (LWP 24334):
#0  0x00007fc5186affb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055be6ee3f6c8 in ?? ()
#5  0x00007fc5051c26a0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 24332):
#0  0x00007fc5186affb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 13 (LWP 24331):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 24330):
#0  0x00007fc5167d1a47 in ?? ()
#1  0x00007fc5071c6680 in ?? ()
#2  0x00007fc511ad5571 in ?? ()
#3  0x00000000000002b7 in ?? ()
#4  0x000055be6ef36e58 in ?? ()
#5  0x00007fc5071c66c0 in ?? ()
#6  0x00007fc5071c6840 in ?? ()
#7  0x000055be6efd8c10 in ?? ()
#8  0x00007fc511ad725d in ?? ()
#9  0x3fad57813d928000 in ?? ()
#10 0x000055be6ef28c00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055be6ef28c00 in ?? ()
#13 0x000000006ef36e58 in ?? ()
#14 0x000055be00000000 in ?? ()
#15 0x41da3ecc3e24688a in ?? ()
#16 0x000055be6efd8c10 in ?? ()
#17 0x00007fc5071c6720 in ?? ()
#18 0x00007fc511adbba3 in ?? ()
#19 0x0000000000000000 in ?? ()

Thread 11 (LWP 24329):
#0  0x00007fc5167d1a47 in ?? ()
#1  0x00007fc5079c7680 in ?? ()
#2  0x00007fc511ad5571 in ?? ()
#3  0x00000000000002b7 in ?? ()
#4  0x000055be6ef37a98 in ?? ()
#5  0x00007fc5079c76c0 in ?? ()
#6  0x00007fc5079c7840 in ?? ()
#7  0x000055be6efd8c10 in ?? ()
#8  0x00007fc511ad725d in ?? ()
#9  0x3fb989d0b0a08000 in ?? ()
#10 0x000055be6ef27600 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055be6ef27600 in ?? ()
#13 0x000000006ef37a98 in ?? ()
#14 0x000055be00000000 in ?? ()
#15 0x41da3ecc3e246889 in ?? ()
#16 0x000055be6efd8c10 in ?? ()
#17 0x00007fc5079c7720 in ?? ()
#18 0x00007fc511adbba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb989d0b0a08000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 10 (LWP 24328):
#0  0x00007fc5167d1a47 in ?? ()
#1  0x00007fc5081c8680 in ?? ()
#2  0x00007fc511ad5571 in ?? ()
#3  0x00000000000002b7 in ?? ()
#4  0x000055be6ef37c58 in ?? ()
#5  0x00007fc5081c86c0 in ?? ()
#6  0x00007fc5081c8840 in ?? ()
#7  0x000055be6efd8c10 in ?? ()
#8  0x00007fc511ad725d in ?? ()
#9  0x3fb961f75ef14000 in ?? ()
#10 0x000055be6ef27b80 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055be6ef27b80 in ?? ()
#13 0x000000006ef37c58 in ?? ()
#14 0x000055be00000000 in ?? ()
#15 0x41da3ecc3e24688b in ?? ()
#16 0x000055be6efd8c10 in ?? ()
#17 0x00007fc5081c8720 in ?? ()
#18 0x00007fc511adbba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb961f75ef14000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 9 (LWP 24327):
#0  0x00007fc5167d1a47 in ?? ()
#1  0x00007fc509db3680 in ?? ()
#2  0x00007fc511ad5571 in ?? ()
#3  0x00000000000002b7 in ?? ()
#4  0x000055be6ef37e18 in ?? ()
#5  0x00007fc509db36c0 in ?? ()
#6  0x00007fc509db3840 in ?? ()
#7  0x000055be6efd8c10 in ?? ()
#8  0x00007fc511ad725d in ?? ()
#9  0x3fb95f1fb9ea4000 in ?? ()
#10 0x000055be6ef28680 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055be6ef28680 in ?? ()
#13 0x000000006ef37e18 in ?? ()
#14 0x000055be00000000 in ?? ()
#15 0x41da3ecc3e24688b in ?? ()
#16 0x000055be6efd8c10 in ?? ()
#17 0x00007fc509db3720 in ?? ()
#18 0x00007fc511adbba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb95f1fb9ea4000 in ?? ()
#21 0x000000006e5c80a0 in ?? ()
#22 0x000055be6ef28680 in ?? ()
#23 0x00007fc509db3860 in ?? ()
#24 0x3fb95f1fb9ea4000 in ?? ()
#25 0x0000000000000000 in ?? ()

Thread 8 (LWP 24324):
#0  0x00007fc5167c4cb9 in ?? ()
#1  0x00007fc50b5b6840 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 7 (LWP 24323):
#0  0x00007fc5186affb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 6 (LWP 24322):
#0  0x00007fc5186b39e2 in ?? ()
#1  0x000055be6ee59ee0 in ?? ()
#2  0x00007fc50a5b44d0 in ?? ()
#3  0x00007fc50a5b4450 in ?? ()
#4  0x00007fc50a5b4570 in ?? ()
#5  0x00007fc50a5b4790 in ?? ()
#6  0x00007fc50a5b47a0 in ?? ()
#7  0x00007fc50a5b44e0 in ?? ()
#8  0x00007fc50a5b44d0 in ?? ()
#9  0x000055be6ee58350 in ?? ()
#10 0x00007fc518a9ec6f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 5 (LWP 24316):
#0  0x00007fc5186affb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000002d in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055be6efdedc8 in ?? ()
#5  0x00007fc50c5b8430 in ?? ()
#6  0x000000000000005a in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 4 (LWP 24315):
#0  0x00007fc5186affb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055be6ee3e848 in ?? ()
#5  0x00007fc50cdb9790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 3 (LWP 24314):
#0  0x00007fc5186affb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055be6ee3e2a8 in ?? ()
#5  0x00007fc50d5ba790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 2 (LWP 24313):
#0  0x00007fc5186affb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055be6ee3e188 in ?? ()
#5  0x00007fc50ddbb790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 1 (LWP 24311):
#0  0x00007fc5186b3d50 in ?? ()
#1  0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20251024 08:17:05.587875 18753 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 3 with UUID 36b0ebc9a5694a778497ec8d94aba993 and pid 24179
************************ BEGIN STACKS **************************
[New LWP 24181]
[New LWP 24182]
[New LWP 24183]
[New LWP 24184]
[New LWP 24190]
[New LWP 24191]
[New LWP 24192]
[New LWP 24195]
[New LWP 24196]
[New LWP 24197]
[New LWP 24198]
[New LWP 24199]
[New LWP 24200]
[New LWP 24201]
[New LWP 24202]
[New LWP 24203]
[New LWP 24204]
[New LWP 24205]
[New LWP 24206]
[New LWP 24207]
[New LWP 24208]
[New LWP 24209]
[New LWP 24210]
[New LWP 24211]
[New LWP 24212]
[New LWP 24213]
[New LWP 24214]
[New LWP 24215]
[New LWP 24216]
[New LWP 24217]
[New LWP 24218]
[New LWP 24219]
[New LWP 24220]
[New LWP 24221]
[New LWP 24222]
[New LWP 24223]
[New LWP 24224]
[New LWP 24225]
[New LWP 24226]
[New LWP 24227]
[New LWP 24228]
[New LWP 24229]
[New LWP 24230]
[New LWP 24231]
[New LWP 24232]
[New LWP 24233]
[New LWP 24234]
[New LWP 24235]
[New LWP 24236]
[New LWP 24237]
[New LWP 24238]
[New LWP 24239]
[New LWP 24240]
[New LWP 24241]
[New LWP 24242]
[New LWP 24243]
[New LWP 24244]
[New LWP 24245]
[New LWP 24246]
[New LWP 24247]
[New LWP 24248]
[New LWP 24249]
[New LWP 24250]
[New LWP 24251]
[New LWP 24252]
[New LWP 24253]
[New LWP 24254]
[New LWP 24255]
[New LWP 24256]
[New LWP 24257]
[New LWP 24258]
[New LWP 24259]
[New LWP 24260]
[New LWP 24261]
[New LWP 24262]
[New LWP 24263]
[New LWP 24264]
[New LWP 24265]
[New LWP 24266]
[New LWP 24267]
[New LWP 24268]
[New LWP 24269]
[New LWP 24270]
[New LWP 24271]
[New LWP 24272]
[New LWP 24273]
[New LWP 24274]
[New LWP 24275]
[New LWP 24276]
[New LWP 24277]
[New LWP 24278]
[New LWP 24279]
[New LWP 24280]
[New LWP 24281]
[New LWP 24282]
[New LWP 24283]
[New LWP 24284]
[New LWP 24285]
[New LWP 24286]
[New LWP 24287]
[New LWP 24288]
[New LWP 24289]
[New LWP 24290]
[New LWP 24291]
[New LWP 24292]
[New LWP 24293]
[New LWP 24294]
[New LWP 24295]
[New LWP 24296]
[New LWP 24297]
[New LWP 24298]
[New LWP 24299]
[New LWP 24300]
[New LWP 24301]
[New LWP 24302]
[New LWP 24303]
[New LWP 24304]
[New LWP 24305]
[New LWP 24306]
[New LWP 24307]
[New LWP 24308]
[New LWP 24309]
0x00007f5349e84d50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 24179 "kudu"  0x00007f5349e84d50 in ?? ()
  2    LWP 24181 "kudu"  0x00007f5349e80fb9 in ?? ()
  3    LWP 24182 "kudu"  0x00007f5349e80fb9 in ?? ()
  4    LWP 24183 "kudu"  0x00007f5349e80fb9 in ?? ()
  5    LWP 24184 "kernel-watcher-" 0x00007f5349e80fb9 in ?? ()
  6    LWP 24190 "ntp client-2419" 0x00007f5349e849e2 in ?? ()
  7    LWP 24191 "file cache-evic" 0x00007f5349e80fb9 in ?? ()
  8    LWP 24192 "sq_acceptor" 0x00007f5347f95cb9 in ?? ()
  9    LWP 24195 "rpc reactor-241" 0x00007f5347fa2a47 in ?? ()
  10   LWP 24196 "rpc reactor-241" 0x00007f5347fa2a47 in ?? ()
  11   LWP 24197 "rpc reactor-241" 0x00007f5347fa2a47 in ?? ()
  12   LWP 24198 "rpc reactor-241" 0x00007f5347fa2a47 in ?? ()
  13   LWP 24199 "MaintenanceMgr " 0x00007f5349e80ad3 in ?? ()
  14   LWP 24200 "txn-status-mana" 0x00007f5349e80fb9 in ?? ()
  15   LWP 24201 "collect_and_rem" 0x00007f5349e80fb9 in ?? ()
  16   LWP 24202 "tc-session-exp-" 0x00007f5349e80fb9 in ?? ()
  17   LWP 24203 "rpc worker-2420" 0x00007f5349e80ad3 in ?? ()
  18   LWP 24204 "rpc worker-2420" 0x00007f5349e80ad3 in ?? ()
  19   LWP 24205 "rpc worker-2420" 0x00007f5349e80ad3 in ?? ()
  20   LWP 24206 "rpc worker-2420" 0x00007f5349e80ad3 in ?? ()
  21   LWP 24207 "rpc worker-2420" 0x00007f5349e80ad3 in ?? ()
  22   LWP 24208 "rpc worker-2420" 0x00007f5349e80ad3 in ?? ()
  23   LWP 24209 "rpc worker-2420" 0x00007f5349e80ad3 in ?? ()
  24   LWP 24210 "rpc worker-2421" 0x00007f5349e80ad3 in ?? ()
  25   LWP 24211 "rpc worker-2421" 0x00007f5349e80ad3 in ?? ()
  26   LWP 24212 "rpc worker-2421" 0x00007f5349e80ad3 in ?? ()
  27   LWP 24213 "rpc worker-2421" 0x00007f5349e80ad3 in ?? ()
  28   LWP 24214 "rpc worker-2421" 0x00007f5349e80ad3 in ?? ()
  29   LWP 24215 "rpc worker-2421" 0x00007f5349e80ad3 in ?? ()
  30   LWP 24216 "rpc worker-2421" 0x00007f5349e80ad3 in ?? ()
  31   LWP 24217 "rpc worker-2421" 0x00007f5349e80ad3 in ?? ()
  32   LWP 24218 "rpc worker-2421" 0x00007f5349e80ad3 in ?? ()
  33   LWP 24219 "rpc worker-2421" 0x00007f5349e80ad3 in ?? ()
  34   LWP 24220 "rpc worker-2422" 0x00007f5349e80ad3 in ?? ()
  35   LWP 24221 "rpc worker-2422" 0x00007f5349e80ad3 in ?? ()
  36   LWP 24222 "rpc worker-2422" 0x00007f5349e80ad3 in ?? ()
  37   LWP 24223 "rpc worker-2422" 0x00007f5349e80ad3 in ?? ()
  38   LWP 24224 "rpc worker-2422" 0x00007f5349e80ad3 in ?? ()
  39   LWP 24225 "rpc worker-2422" 0x00007f5349e80ad3 in ?? ()
  40   LWP 24226 "rpc worker-2422" 0x00007f5349e80ad3 in ?? ()
  41   LWP 24227 "rpc worker-2422" 0x00007f5349e80ad3 in ?? ()
  42   LWP 24228 "rpc worker-2422" 0x00007f5349e80ad3 in ?? ()
  43   LWP 24229 "rpc worker-2422" 0x00007f5349e80ad3 in ?? ()
  44   LWP 24230 "rpc worker-2423" 0x00007f5349e80ad3 in ?? ()
  45   LWP 24231 "rpc worker-2423" 0x00007f5349e80ad3 in ?? ()
  46   LWP 24232 "rpc worker-2423" 0x00007f5349e80ad3 in ?? ()
  47   LWP 24233 "rpc worker-2423" 0x00007f5349e80ad3 in ?? ()
  48   LWP 24234 "rpc worker-2423" 0x00007f5349e80ad3 in ?? ()
  49   LWP 24235 "rpc worker-2423" 0x00007f5349e80ad3 in ?? ()
  50   LWP 24236 "rpc worker-2423" 0x00007f5349e80ad3 in ?? ()
  51   LWP 24237 "rpc worker-2423" 0x00007f5349e80ad3 in ?? ()
  52   LWP 24238 "rpc worker-2423" 0x00007f5349e80ad3 in ?? ()
  53   LWP 24239 "rpc worker-2423" 0x00007f5349e80ad3 in ?? ()
  54   LWP 24240 "rpc worker-2424" 0x00007f5349e80ad3 in ?? ()
  55   LWP 24241 "rpc worker-2424" 0x00007f5349e80ad3 in ?? ()
  56   LWP 24242 "rpc worker-2424" 0x00007f5349e80ad3 in ?? ()
  57   LWP 24243 "rpc worker-2424" 0x00007f5349e80ad3 in ?? ()
  58   LWP 24244 "rpc worker-2424" 0x00007f5349e80ad3 in ?? ()
  59   LWP 24245 "rpc worker-2424" 0x00007f5349e80ad3 in ?? ()
  60   LWP 24246 "rpc worker-2424" 0x00007f5349e80ad3 in ?? ()
  61   LWP 24247 "rpc worker-2424" 0x00007f5349e80ad3 in ?? ()
  62   LWP 24248 "rpc worker-2424" 0x00007f5349e80ad3 in ?? ()
  63   LWP 24249 "rpc worker-2424" 0x00007f5349e80ad3 in ?? ()
  64   LWP 24250 "rpc worker-2425" 0x00007f5349e80ad3 in ?? ()
  65   LWP 24251 "rpc worker-2425" 0x00007f5349e80ad3 in ?? ()
  66   LWP 24252 "rpc worker-2425" 0x00007f5349e80ad3 in ?? ()
  67   LWP 24253 "rpc worker-2425" 0x00007f5349e80ad3 in ?? ()
  68   LWP 24254 "rpc worker-2425" 0x00007f5349e80ad3 in ?? ()
  69   LWP 24255 "rpc worker-2425" 0x00007f5349e80ad3 in ?? ()
  70   LWP 24256 "rpc worker-2425" 0x00007f5349e80ad3 in ?? ()
  71   LWP 24257 "rpc worker-2425" 0x00007f5349e80ad3 in ?? ()
  72   LWP 24258 "rpc worker-2425" 0x00007f5349e80ad3 in ?? ()
  73   LWP 24259 "rpc worker-2425" 0x00007f5349e80ad3 in ?? ()
  74   LWP 24260 "rpc worker-2426" 0x00007f5349e80ad3 in ?? ()
  75   LWP 24261 "rpc worker-2426" 0x00007f5349e80ad3 in ?? ()
  76   LWP 24262 "rpc worker-2426" 0x00007f5349e80ad3 in ?? ()
  77   LWP 24263 "rpc worker-2426" 0x00007f5349e80ad3 in ?? ()
  78   LWP 24264 "rpc worker-2426" 0x00007f5349e80ad3 in ?? ()
  79   LWP 24265 "rpc worker-2426" 0x00007f5349e80ad3 in ?? ()
  80   LWP 24266 "rpc worker-2426" 0x00007f5349e80ad3 in ?? ()
  81   LWP 24267 "rpc worker-2426" 0x00007f5349e80ad3 in ?? ()
  82   LWP 24268 "rpc worker-2426" 0x00007f5349e80ad3 in ?? ()
  83   LWP 24269 "rpc worker-2426" 0x00007f5349e80ad3 in ?? ()
  84   LWP 24270 "rpc worker-2427" 0x00007f5349e80ad3 in ?? ()
  85   LWP 24271 "rpc worker-2427" 0x00007f5349e80ad3 in ?? ()
  86   LWP 24272 "rpc worker-2427" 0x00007f5349e80ad3 in ?? ()
  87   LWP 24273 "rpc worker-2427" 0x00007f5349e80ad3 in ?? ()
  88   LWP 24274 "rpc worker-2427" 0x00007f5349e80ad3 in ?? ()
  89   LWP 24275 "rpc worker-2427" 0x00007f5349e80ad3 in ?? ()
  90   LWP 24276 "rpc worker-2427" 0x00007f5349e80ad3 in ?? ()
  91   LWP 24277 "rpc worker-2427" 0x00007f5349e80ad3 in ?? ()
  92   LWP 24278 "rpc worker-2427" 0x00007f5349e80ad3 in ?? ()
  93   LWP 24279 "rpc worker-2427" 0x00007f5349e80ad3 in ?? ()
  94   LWP 24280 "rpc worker-2428" 0x00007f5349e80ad3 in ?? ()
  95   LWP 24281 "rpc worker-2428" 0x00007f5349e80ad3 in ?? ()
  96   LWP 24282 "rpc worker-2428" 0x00007f5349e80ad3 in ?? ()
  97   LWP 24283 "rpc worker-2428" 0x00007f5349e80ad3 in ?? ()
  98   LWP 24284 "rpc worker-2428" 0x00007f5349e80ad3 in ?? ()
  99   LWP 24285 "rpc worker-2428" 0x00007f5349e80ad3 in ?? ()
  100  LWP 24286 "rpc worker-2428" 0x00007f5349e80ad3 in ?? ()
  101  LWP 24287 "rpc worker-2428" 0x00007f5349e80ad3 in ?? ()
  102  LWP 24288 "rpc worker-2428" 0x00007f5349e80ad3 in ?? ()
  103  LWP 24289 "rpc worker-2428" 0x00007f5349e80ad3 in ?? ()
  104  LWP 24290 "rpc worker-2429" 0x00007f5349e80ad3 in ?? ()
  105  LWP 24291 "rpc worker-2429" 0x00007f5349e80ad3 in ?? ()
  106  LWP 24292 "rpc worker-2429" 0x00007f5349e80ad3 in ?? ()
  107  LWP 24293 "rpc worker-2429" 0x00007f5349e80ad3 in ?? ()
  108  LWP 24294 "rpc worker-2429" 0x00007f5349e80ad3 in ?? ()
  109  LWP 24295 "rpc worker-2429" 0x00007f5349e80ad3 in ?? ()
  110  LWP 24296 "rpc worker-2429" 0x00007f5349e80ad3 in ?? ()
  111  LWP 24297 "rpc worker-2429" 0x00007f5349e80ad3 in ?? ()
  112  LWP 24298 "rpc worker-2429" 0x00007f5349e80ad3 in ?? ()
  113  LWP 24299 "rpc worker-2429" 0x00007f5349e80ad3 in ?? ()
  114  LWP 24300 "rpc worker-2430" 0x00007f5349e80ad3 in ?? ()
  115  LWP 24301 "rpc worker-2430" 0x00007f5349e80ad3 in ?? ()
  116  LWP 24302 "rpc worker-2430" 0x00007f5349e80ad3 in ?? ()
  117  LWP 24303 "diag-logger-243" 0x00007f5349e80fb9 in ?? ()
  118  LWP 24304 "result-tracker-" 0x00007f5349e80fb9 in ?? ()
  119  LWP 24305 "excess-log-dele" 0x00007f5349e80fb9 in ?? ()
  120  LWP 24306 "tcmalloc-memory" 0x00007f5349e80fb9 in ?? ()
  121  LWP 24307 "acceptor-24307" 0x00007f5347fa40c7 in ?? ()
  122  LWP 24308 "heartbeat-24308" 0x00007f5349e80fb9 in ?? ()
  123  LWP 24309 "maintenance_sch" 0x00007f5349e80fb9 in ?? ()

Thread 123 (LWP 24309):
#0  0x00007f5349e80fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000026 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a18050de50 in ?? ()
#5  0x00007f5301128470 in ?? ()
#6  0x000000000000004c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 24308):
#0  0x00007f5349e80fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a180477930 in ?? ()
#5  0x00007f53019293f0 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 121 (LWP 24307):
#0  0x00007f5347fa40c7 in ?? ()
#1  0x00007f530212a020 in ?? ()
#2  0x00007f5349b04c02 in ?? ()
#3  0x00007f530212a020 in ?? ()
#4  0x0000000000080800 in ?? ()
#5  0x00007f530212a3e0 in ?? ()
#6  0x00007f530212a090 in ?? ()
#7  0x000055a1804330f8 in ?? ()
#8  0x00007f5349b0a699 in ?? ()
#9  0x00007f530212a510 in ?? ()
#10 0x00007f530212a700 in ?? ()
#11 0x0000008000000000 in ?? ()
#12 0x00007f5349e843a7 in ?? ()
#13 0x00007f530212b520 in ?? ()
#14 0x00007f530212a260 in ?? ()
#15 0x000055a1804d30c0 in ?? ()
#16 0x0000000000000000 in ?? ()

Thread 120 (LWP 24306):
#0  0x00007f5349e80fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffd42b663a0 in ?? ()
#5  0x00007f530292b670 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 24305):
#0  0x00007f5349e80fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 118 (LWP 24304):
#0  0x00007f5349e80fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a1803abb70 in ?? ()
#5  0x00007f530392d680 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 24303):
#0  0x00007f5349e80fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a1806b4690 in ?? ()
#5  0x00007f530412e550 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 116 (LWP 24302):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000007 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055a180686ebc in ?? ()
#4  0x00007f530492f5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f530492f5f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055a180686ea8 in ?? ()
#9  0x00007f5349e80770 in ?? ()
#10 0x00007f530492f5f0 in ?? ()
#11 0x00007f530492f650 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 115 (LWP 24301):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055a180686e3c in ?? ()
#4  0x00007f53051305d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f53051305f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055a180686e28 in ?? ()
#9  0x00007f5349e80770 in ?? ()
#10 0x00007f53051305f0 in ?? ()
#11 0x00007f5305130650 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 114 (LWP 24300):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 24299):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 24298):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 24297):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 24296):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 24295):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 24294):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 24293):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 24292):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 24291):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 24290):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 24289):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 24288):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 24287):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 24286):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 24285):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 24284):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 97 (LWP 24283):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 24282):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 24281):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 24280):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 24279):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 24278):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 24277):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 24276):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 24275):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 24274):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 24273):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 24272):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 24271):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 24270):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 24269):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 24268):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 24267):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 24266):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 24265):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 24264):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 24263):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 76 (LWP 24262):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055a1806778b8 in ?? ()
#4  0x00007f53189575d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f53189575f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 75 (LWP 24261):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 74 (LWP 24260):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 24259):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 24258):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 24257):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 24256):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 24255):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 24254):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 24253):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 24252):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 24251):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 24250):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 24249):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 24248):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 24247):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 24246):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 24245):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 24244):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 24243):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 56 (LWP 24242):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055a180676db8 in ?? ()
#4  0x00007f532296b5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f532296b5f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 55 (LWP 24241):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 24240):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 24239):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 24238):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 24237):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 24236):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 24235):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 24234):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 24233):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 24232):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 24231):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 24230):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 24229):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 24228):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 24227):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 24226):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 24225):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 24224):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 24223):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 36 (LWP 24222):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055a1806762b8 in ?? ()
#4  0x00007f532c97f5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f532c97f5f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 35 (LWP 24221):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 34 (LWP 24220):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 33 (LWP 24219):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 32 (LWP 24218):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 31 (LWP 24217):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 30 (LWP 24216):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 24215):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 24214):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 24213):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 24212):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 24211):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 24210):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 24209):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 24208):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 24207):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 24206):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 19 (LWP 24205):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 18 (LWP 24204):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 17 (LWP 24203):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 16 (LWP 24202):
#0  0x00007f5349e80fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 15 (LWP 24201):
#0  0x00007f5349e80fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a1803916c8 in ?? ()
#5  0x00007f53371946a0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 24200):
#0  0x00007f5349e80fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 13 (LWP 24199):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 24198):
#0  0x00007f5347fa2a47 in ?? ()
#1  0x00007f5338997680 in ?? ()
#2  0x00007f53432a6571 in ?? ()
#3  0x00000000000002b8 in ?? ()
#4  0x000055a180488e58 in ?? ()
#5  0x00007f53389976c0 in ?? ()
#6  0x00007f5338997840 in ?? ()
#7  0x000055a18052ac10 in ?? ()
#8  0x00007f53432a825d in ?? ()
#9  0x3fb97c85bbd14000 in ?? ()
#10 0x000055a18047ac00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055a18047ac00 in ?? ()
#13 0x0000000080488e58 in ?? ()
#14 0x000055a100000000 in ?? ()
#15 0x41da3ecc3e24688c in ?? ()
#16 0x000055a18052ac10 in ?? ()
#17 0x00007f5338997720 in ?? ()
#18 0x00007f53432acba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb97c85bbd14000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 11 (LWP 24197):
#0  0x00007f5347fa2a47 in ?? ()
#1  0x00007f5339198680 in ?? ()
#2  0x00007f53432a6571 in ?? ()
#3  0x00000000000002b8 in ?? ()
#4  0x000055a180489a98 in ?? ()
#5  0x00007f53391986c0 in ?? ()
#6  0x00007f5339198840 in ?? ()
#7  0x000055a18052ac10 in ?? ()
#8  0x00007f53432a825d in ?? ()
#9  0x3fb97be4ac218000 in ?? ()
#10 0x000055a180479600 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055a180479600 in ?? ()
#13 0x0000000080489a98 in ?? ()
#14 0x000055a100000000 in ?? ()
#15 0x41da3ecc3e24688c in ?? ()
#16 0x000055a18052ac10 in ?? ()
#17 0x00007f5339198720 in ?? ()
#18 0x00007f53432acba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb97be4ac218000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 10 (LWP 24196):
#0  0x00007f5347fa2a47 in ?? ()
#1  0x00007f5339999680 in ?? ()
#2  0x00007f53432a6571 in ?? ()
#3  0x00000000000002b8 in ?? ()
#4  0x000055a180489c58 in ?? ()
#5  0x00007f53399996c0 in ?? ()
#6  0x00007f5339999840 in ?? ()
#7  0x000055a18052ac10 in ?? ()
#8  0x00007f53432a825d in ?? ()
#9  0x3fb9788b2377c000 in ?? ()
#10 0x000055a180479b80 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055a180479b80 in ?? ()
#13 0x0000000080489c58 in ?? ()
#14 0x000055a100000000 in ?? ()
#15 0x41da3ecc3e24688c in ?? ()
#16 0x000055a18052ac10 in ?? ()
#17 0x00007f5339999720 in ?? ()
#18 0x00007f53432acba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb9788b2377c000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 9 (LWP 24195):
#0  0x00007f5347fa2a47 in ?? ()
#1  0x00007f533b584680 in ?? ()
#2  0x00007f53432a6571 in ?? ()
#3  0x00000000000002b8 in ?? ()
#4  0x000055a180489e18 in ?? ()
#5  0x00007f533b5846c0 in ?? ()
#6  0x00007f533b584840 in ?? ()
#7  0x000055a18052ac10 in ?? ()
#8  0x00007f53432a825d in ?? ()
#9  0x3fb972da1f084000 in ?? ()
#10 0x000055a18047a100 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055a18047a100 in ?? ()
#13 0x0000000080489e18 in ?? ()
#14 0x000055a100000000 in ?? ()
#15 0x41da3ecc3e24688a in ?? ()
#16 0x000055a18052ac10 in ?? ()
#17 0x00007f533b584720 in ?? ()
#18 0x00007f53432acba3 in ?? ()
#19 0x0000000000000000 in ?? ()

Thread 8 (LWP 24192):
#0  0x00007f5347f95cb9 in ?? ()
#1  0x00007f533cd87840 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 7 (LWP 24191):
#0  0x00007f5349e80fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 6 (LWP 24190):
#0  0x00007f5349e849e2 in ?? ()
#1  0x000055a1803abee0 in ?? ()
#2  0x00007f533bd854d0 in ?? ()
#3  0x00007f533bd85450 in ?? ()
#4  0x00007f533bd85570 in ?? ()
#5  0x00007f533bd85790 in ?? ()
#6  0x00007f533bd857a0 in ?? ()
#7  0x00007f533bd854e0 in ?? ()
#8  0x00007f533bd854d0 in ?? ()
#9  0x000055a1803aa350 in ?? ()
#10 0x00007f534a26fc6f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 5 (LWP 24184):
#0  0x00007f5349e80fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000030 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a180530dc8 in ?? ()
#5  0x00007f533dd89430 in ?? ()
#6  0x0000000000000060 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 4 (LWP 24183):
#0  0x00007f5349e80fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a180390848 in ?? ()
#5  0x00007f533e58a790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 3 (LWP 24182):
#0  0x00007f5349e80fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a1803902a8 in ?? ()
#5  0x00007f533ed8b790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 2 (LWP 24181):
#0  0x00007f5349e80fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a180390188 in ?? ()
#5  0x00007f533f58c790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 1 (LWP 24179):
#0  0x00007f5349e84d50 in ?? ()
#1  0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20251024 08:17:06.072073 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 24045
I20251024 08:17:06.084118 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 23911
I20251024 08:17:06.096768 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 24311
I20251024 08:17:06.110723 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 24179
I20251024 08:17:06.116164 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 19888
2025-10-24T08:17:06Z chronyd exiting
I20251024 08:17:06.133477 18753 test_util.cc:183] -----------------------------------------------
I20251024 08:17:06.133559 18753 test_util.cc:184] Had failures, leaving test files at /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0

Full log

Note: This is test shard 1 of 8.
[==========] Running 2 tests from 2 test suites.
[----------] Global test environment set-up.
[----------] 1 test from MaintenanceModeRF3ITest
[ RUN      ] MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate
2025-10-24T08:16:03Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2025-10-24T08:16:03Z Disabled control of system clock
WARNING: Logging before InitGoogleLogging() is written to STDERR
I20251024 08:16:03.316375 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.18.80.126:40485
--webserver_interface=127.18.80.126
--webserver_port=0
--builtin_ntp_servers=127.18.80.84:33025
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.18.80.126:40485 with env {}
W20251024 08:16:03.392380 18761 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:03.392565 18761 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:03.392581 18761 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:03.393913 18761 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20251024 08:16:03.393951 18761 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:03.393962 18761 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20251024 08:16:03.393972 18761 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20251024 08:16:03.395475 18761 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:33025
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.18.80.126:40485
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.18.80.126:40485
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.18.80.126
--webserver_port=0
--never_fsync=true
--heap_profile_path=/tmp/kudu.18761
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:03.395735 18761 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:03.395920 18761 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:03.398389 18766 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:03.398396 18767 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:03.398478 18769 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:03.398588 18761 server_base.cc:1047] running on GCE node
I20251024 08:16:03.398855 18761 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:03.399076 18761 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:03.400199 18761 hybrid_clock.cc:648] HybridClock initialized: now 1761293763400195 us; error 35 us; skew 500 ppm
I20251024 08:16:03.401436 18761 webserver.cc:492] Webserver started at http://127.18.80.126:35279/ using document root <none> and password file <none>
I20251024 08:16:03.401662 18761 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:03.401705 18761 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:03.401827 18761 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20251024 08:16:03.402740 18761 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/data/instance:
uuid: "32f4918bb4bc4166aa635d60880ad40a"
format_stamp: "Formatted at 2025-10-24 08:16:03 on dist-test-slave-13l5"
I20251024 08:16:03.403024 18761 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/wal/instance:
uuid: "32f4918bb4bc4166aa635d60880ad40a"
format_stamp: "Formatted at 2025-10-24 08:16:03 on dist-test-slave-13l5"
I20251024 08:16:03.404268 18761 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.000s	sys 0.002s
I20251024 08:16:03.404996 18775 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:03.405190 18761 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:03.405265 18761 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/wal
uuid: "32f4918bb4bc4166aa635d60880ad40a"
format_stamp: "Formatted at 2025-10-24 08:16:03 on dist-test-slave-13l5"
I20251024 08:16:03.405339 18761 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251024 08:16:03.418058 18761 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:03.418357 18761 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:03.418478 18761 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:03.422122 18761 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.126:40485
I20251024 08:16:03.422180 18827 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.126:40485 every 8 connection(s)
I20251024 08:16:03.422465 18761 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/data/info.pb
I20251024 08:16:03.423084 18828 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20251024 08:16:03.425321 18828 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a: Bootstrap starting.
I20251024 08:16:03.425895 18828 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a: Neither blocks nor log segments found. Creating new log.
I20251024 08:16:03.426184 18828 log.cc:826] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a: Log is configured to *not* fsync() on all Append() calls
I20251024 08:16:03.426795 18828 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a: No bootstrap required, opened a new log
I20251024 08:16:03.427928 18828 raft_consensus.cc:359] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "32f4918bb4bc4166aa635d60880ad40a" member_type: VOTER last_known_addr { host: "127.18.80.126" port: 40485 } }
I20251024 08:16:03.428048 18828 raft_consensus.cc:385] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251024 08:16:03.428071 18828 raft_consensus.cc:740] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 32f4918bb4bc4166aa635d60880ad40a, State: Initialized, Role: FOLLOWER
I20251024 08:16:03.428200 18828 consensus_queue.cc:260] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "32f4918bb4bc4166aa635d60880ad40a" member_type: VOTER last_known_addr { host: "127.18.80.126" port: 40485 } }
I20251024 08:16:03.428257 18828 raft_consensus.cc:399] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20251024 08:16:03.428278 18828 raft_consensus.cc:493] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20251024 08:16:03.428306 18828 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [term 0 FOLLOWER]: Advancing to term 1
I20251024 08:16:03.428814 18828 raft_consensus.cc:515] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "32f4918bb4bc4166aa635d60880ad40a" member_type: VOTER last_known_addr { host: "127.18.80.126" port: 40485 } }
I20251024 08:16:03.428951 18828 leader_election.cc:304] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 32f4918bb4bc4166aa635d60880ad40a; no voters: 
I20251024 08:16:03.429111 18828 leader_election.cc:290] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [CANDIDATE]: Term 1 election: Requested vote from peers 
I20251024 08:16:03.429168 18831 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [term 1 FOLLOWER]: Leader election won for term 1
I20251024 08:16:03.429314 18828 sys_catalog.cc:565] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [sys.catalog]: configured and running, proceeding with master startup.
I20251024 08:16:03.429320 18831 raft_consensus.cc:697] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [term 1 LEADER]: Becoming Leader. State: Replica: 32f4918bb4bc4166aa635d60880ad40a, State: Running, Role: LEADER
I20251024 08:16:03.429420 18831 consensus_queue.cc:237] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "32f4918bb4bc4166aa635d60880ad40a" member_type: VOTER last_known_addr { host: "127.18.80.126" port: 40485 } }
I20251024 08:16:03.429749 18832 sys_catalog.cc:455] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "32f4918bb4bc4166aa635d60880ad40a" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "32f4918bb4bc4166aa635d60880ad40a" member_type: VOTER last_known_addr { host: "127.18.80.126" port: 40485 } } }
I20251024 08:16:03.429775 18833 sys_catalog.cc:455] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [sys.catalog]: SysCatalogTable state changed. Reason: New leader 32f4918bb4bc4166aa635d60880ad40a. Latest consensus state: current_term: 1 leader_uuid: "32f4918bb4bc4166aa635d60880ad40a" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "32f4918bb4bc4166aa635d60880ad40a" member_type: VOTER last_known_addr { host: "127.18.80.126" port: 40485 } } }
I20251024 08:16:03.429903 18832 sys_catalog.cc:458] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [sys.catalog]: This master's current role is: LEADER
I20251024 08:16:03.430173 18840 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20251024 08:16:03.429911 18833 sys_catalog.cc:458] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [sys.catalog]: This master's current role is: LEADER
I20251024 08:16:03.430732 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 18761
I20251024 08:16:03.430814 18753 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/wal/instance
I20251024 08:16:03.431910 18840 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20251024 08:16:03.433511 18840 catalog_manager.cc:1357] Generated new cluster ID: 52bb1c2303b84fa3a260544ca2ff0d4e
I20251024 08:16:03.433560 18840 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20251024 08:16:03.444648 18840 catalog_manager.cc:1380] Generated new certificate authority record
I20251024 08:16:03.445214 18840 catalog_manager.cc:1514] Loading token signing keys...
I20251024 08:16:03.454071 18840 catalog_manager.cc:6022] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a: Generated new TSK 0
I20251024 08:16:03.454259 18840 catalog_manager.cc:1524] Initializing in-progress tserver states...
I20251024 08:16:03.459444 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.18.80.65:0
--local_ip_for_outbound_sockets=127.18.80.65
--webserver_interface=127.18.80.65
--webserver_port=0
--tserver_master_addrs=127.18.80.126:40485
--builtin_ntp_servers=127.18.80.84:33025
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
W20251024 08:16:03.534219 18852 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:03.534390 18852 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:03.534416 18852 flags.cc:425] Enabled unsafe flag: --enable_log_gc=false
W20251024 08:16:03.534436 18852 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:03.535749 18852 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:03.535810 18852 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.18.80.65
I20251024 08:16:03.537254 18852 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:33025
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.18.80.65:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.18.80.65
--webserver_port=0
--enable_log_gc=false
--tserver_master_addrs=127.18.80.126:40485
--never_fsync=true
--heap_profile_path=/tmp/kudu.18852
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.18.80.65
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:03.537477 18852 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:03.537698 18852 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:03.540247 18858 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:03.540251 18857 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:03.540352 18860 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:03.540462 18852 server_base.cc:1047] running on GCE node
I20251024 08:16:03.540632 18852 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:03.540859 18852 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:03.542050 18852 hybrid_clock.cc:648] HybridClock initialized: now 1761293763542037 us; error 26 us; skew 500 ppm
I20251024 08:16:03.543467 18852 webserver.cc:492] Webserver started at http://127.18.80.65:43829/ using document root <none> and password file <none>
I20251024 08:16:03.543685 18852 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:03.543737 18852 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:03.543864 18852 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20251024 08:16:03.544940 18852 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/data/instance:
uuid: "01590ebaf5524b53b67dbcd5ce628ab0"
format_stamp: "Formatted at 2025-10-24 08:16:03 on dist-test-slave-13l5"
I20251024 08:16:03.545286 18852 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/wal/instance:
uuid: "01590ebaf5524b53b67dbcd5ce628ab0"
format_stamp: "Formatted at 2025-10-24 08:16:03 on dist-test-slave-13l5"
I20251024 08:16:03.546743 18852 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.000s	sys 0.001s
I20251024 08:16:03.547586 18866 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:03.547781 18852 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20251024 08:16:03.547850 18852 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/wal
uuid: "01590ebaf5524b53b67dbcd5ce628ab0"
format_stamp: "Formatted at 2025-10-24 08:16:03 on dist-test-slave-13l5"
I20251024 08:16:03.547914 18852 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251024 08:16:03.559518 18852 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:03.559744 18852 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:03.559837 18852 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:03.560048 18852 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251024 08:16:03.567090 18852 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251024 08:16:03.567196 18852 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:03.567256 18852 ts_tablet_manager.cc:616] Registered 0 tablets
I20251024 08:16:03.567286 18852 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:03.573076 18852 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.65:34683
I20251024 08:16:03.573160 18979 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.65:34683 every 8 connection(s)
I20251024 08:16:03.573474 18852 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/data/info.pb
I20251024 08:16:03.577086 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 18852
I20251024 08:16:03.577152 18753 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/wal/instance
I20251024 08:16:03.578238 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.18.80.66:0
--local_ip_for_outbound_sockets=127.18.80.66
--webserver_interface=127.18.80.66
--webserver_port=0
--tserver_master_addrs=127.18.80.126:40485
--builtin_ntp_servers=127.18.80.84:33025
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
I20251024 08:16:03.578552 18980 heartbeater.cc:344] Connected to a master server at 127.18.80.126:40485
I20251024 08:16:03.578650 18980 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:03.578866 18980 heartbeater.cc:507] Master 127.18.80.126:40485 requested a full tablet report, sending...
I20251024 08:16:03.579321 18792 ts_manager.cc:194] Registered new tserver with Master: 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683)
I20251024 08:16:03.580214 18792 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.18.80.65:57371
W20251024 08:16:03.653558 18983 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:03.653728 18983 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:03.653749 18983 flags.cc:425] Enabled unsafe flag: --enable_log_gc=false
W20251024 08:16:03.653772 18983 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:03.655089 18983 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:03.655149 18983 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.18.80.66
I20251024 08:16:03.656554 18983 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:33025
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.18.80.66:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.18.80.66
--webserver_port=0
--enable_log_gc=false
--tserver_master_addrs=127.18.80.126:40485
--never_fsync=true
--heap_profile_path=/tmp/kudu.18983
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.18.80.66
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:03.656816 18983 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:03.657061 18983 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:03.659929 18989 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:03.659947 18988 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:03.659967 18983 server_base.cc:1047] running on GCE node
W20251024 08:16:03.659950 18991 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:03.660305 18983 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:03.660490 18983 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:03.661608 18983 hybrid_clock.cc:648] HybridClock initialized: now 1761293763661605 us; error 32 us; skew 500 ppm
I20251024 08:16:03.662802 18983 webserver.cc:492] Webserver started at http://127.18.80.66:36293/ using document root <none> and password file <none>
I20251024 08:16:03.663003 18983 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:03.663043 18983 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:03.663146 18983 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20251024 08:16:03.663950 18983 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-1/data/instance:
uuid: "1ca2674bda43456db1ddae2032441d86"
format_stamp: "Formatted at 2025-10-24 08:16:03 on dist-test-slave-13l5"
I20251024 08:16:03.664249 18983 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-1/wal/instance:
uuid: "1ca2674bda43456db1ddae2032441d86"
format_stamp: "Formatted at 2025-10-24 08:16:03 on dist-test-slave-13l5"
I20251024 08:16:03.665488 18983 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.002s	sys 0.000s
I20251024 08:16:03.666234 18997 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:03.666397 18983 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20251024 08:16:03.666474 18983 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-1/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-1/wal
uuid: "1ca2674bda43456db1ddae2032441d86"
format_stamp: "Formatted at 2025-10-24 08:16:03 on dist-test-slave-13l5"
I20251024 08:16:03.666536 18983 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251024 08:16:03.682605 18983 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:03.682878 18983 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:03.682986 18983 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:03.683197 18983 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251024 08:16:03.683501 18983 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251024 08:16:03.683535 18983 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:03.683558 18983 ts_tablet_manager.cc:616] Registered 0 tablets
I20251024 08:16:03.683573 18983 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:03.689307 18983 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.66:44791
I20251024 08:16:03.689354 19110 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.66:44791 every 8 connection(s)
I20251024 08:16:03.689695 18983 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-1/data/info.pb
I20251024 08:16:03.692481 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 18983
I20251024 08:16:03.692545 18753 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-1/wal/instance
I20251024 08:16:03.693559 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.18.80.67:0
--local_ip_for_outbound_sockets=127.18.80.67
--webserver_interface=127.18.80.67
--webserver_port=0
--tserver_master_addrs=127.18.80.126:40485
--builtin_ntp_servers=127.18.80.84:33025
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
I20251024 08:16:03.694191 19111 heartbeater.cc:344] Connected to a master server at 127.18.80.126:40485
I20251024 08:16:03.694278 19111 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:03.694463 19111 heartbeater.cc:507] Master 127.18.80.126:40485 requested a full tablet report, sending...
I20251024 08:16:03.694787 18792 ts_manager.cc:194] Registered new tserver with Master: 1ca2674bda43456db1ddae2032441d86 (127.18.80.66:44791)
I20251024 08:16:03.695133 18792 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.18.80.66:54459
W20251024 08:16:03.768496 19114 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:03.768674 19114 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:03.768693 19114 flags.cc:425] Enabled unsafe flag: --enable_log_gc=false
W20251024 08:16:03.768708 19114 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:03.770071 19114 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:03.770124 19114 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.18.80.67
I20251024 08:16:03.771728 19114 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:33025
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.18.80.67:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.18.80.67
--webserver_port=0
--enable_log_gc=false
--tserver_master_addrs=127.18.80.126:40485
--never_fsync=true
--heap_profile_path=/tmp/kudu.19114
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.18.80.67
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:03.771929 19114 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:03.772109 19114 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:03.774600 19120 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:03.774677 19122 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:03.774713 19119 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:03.774904 19114 server_base.cc:1047] running on GCE node
I20251024 08:16:03.775069 19114 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:03.775293 19114 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:03.776432 19114 hybrid_clock.cc:648] HybridClock initialized: now 1761293763776409 us; error 37 us; skew 500 ppm
I20251024 08:16:03.777561 19114 webserver.cc:492] Webserver started at http://127.18.80.67:34269/ using document root <none> and password file <none>
I20251024 08:16:03.777756 19114 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:03.777832 19114 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:03.777941 19114 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20251024 08:16:03.778781 19114 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-2/data/instance:
uuid: "124688609f6a4035893a031320ea1d52"
format_stamp: "Formatted at 2025-10-24 08:16:03 on dist-test-slave-13l5"
I20251024 08:16:03.779089 19114 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-2/wal/instance:
uuid: "124688609f6a4035893a031320ea1d52"
format_stamp: "Formatted at 2025-10-24 08:16:03 on dist-test-slave-13l5"
I20251024 08:16:03.780169 19114 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.002s	sys 0.000s
I20251024 08:16:03.780826 19128 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:03.781015 19114 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20251024 08:16:03.781081 19114 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-2/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-2/wal
uuid: "124688609f6a4035893a031320ea1d52"
format_stamp: "Formatted at 2025-10-24 08:16:03 on dist-test-slave-13l5"
I20251024 08:16:03.781141 19114 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251024 08:16:03.794178 19114 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:03.794440 19114 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:03.794561 19114 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:03.794754 19114 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251024 08:16:03.795050 19114 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251024 08:16:03.795081 19114 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:03.795111 19114 ts_tablet_manager.cc:616] Registered 0 tablets
I20251024 08:16:03.795130 19114 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:03.800534 19114 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.67:45549
I20251024 08:16:03.800580 19241 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.67:45549 every 8 connection(s)
I20251024 08:16:03.800865 19114 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-2/data/info.pb
I20251024 08:16:03.805234 19242 heartbeater.cc:344] Connected to a master server at 127.18.80.126:40485
I20251024 08:16:03.805326 19242 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:03.805483 19242 heartbeater.cc:507] Master 127.18.80.126:40485 requested a full tablet report, sending...
I20251024 08:16:03.805820 18792 ts_manager.cc:194] Registered new tserver with Master: 124688609f6a4035893a031320ea1d52 (127.18.80.67:45549)
I20251024 08:16:03.806123 18792 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.18.80.67:44157
I20251024 08:16:03.807967 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 19114
I20251024 08:16:03.808037 18753 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-2/wal/instance
I20251024 08:16:03.809154 18753 external_mini_cluster.cc:949] 3 TS(s) registered with all masters
I20251024 08:16:03.814617 18753 test_util.cc:276] Using random seed: 690169046
I20251024 08:16:03.820350 18792 catalog_manager.cc:2257] Servicing CreateTable request from {username='slave'} at 127.0.0.1:50004:
name: "test-workload"
schema {
  columns {
    name: "key"
    type: INT32
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "int_val"
    type: INT32
    is_key: false
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "string_val"
    type: STRING
    is_key: false
    is_nullable: true
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
  rows: "<redacted>""\004\001\000UUU\025\004\001\000\252\252\252*\004\001\000\377\377\377?\004\001\000TUUU\004\001\000\251\252\252j"
  indirect_data: "<redacted>"""
}
partition_schema {
  range_schema {
    columns {
      name: "key"
    }
  }
}
W20251024 08:16:03.820649 18792 catalog_manager.cc:7011] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table test-workload in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20251024 08:16:03.827332 19043 tablet_service.cc:1505] Processing CreateTablet for tablet f897508fbb474c0eab2ead7fb82d2e0e (DEFAULT_TABLE table=test-workload [id=03d3358e53e6427b8ad498da6c2e9168]), partition=RANGE (key) PARTITION 715827882 <= VALUES < 1073741823
I20251024 08:16:03.827332 18912 tablet_service.cc:1505] Processing CreateTablet for tablet f897508fbb474c0eab2ead7fb82d2e0e (DEFAULT_TABLE table=test-workload [id=03d3358e53e6427b8ad498da6c2e9168]), partition=RANGE (key) PARTITION 715827882 <= VALUES < 1073741823
I20251024 08:16:03.827333 18914 tablet_service.cc:1505] Processing CreateTablet for tablet 39d918845bd2430ea45706346c3a38d2 (DEFAULT_TABLE table=test-workload [id=03d3358e53e6427b8ad498da6c2e9168]), partition=RANGE (key) PARTITION VALUES < 357913941
I20251024 08:16:03.827579 19044 tablet_service.cc:1505] Processing CreateTablet for tablet 90fe2a8a32ae4fd3a0574ccd6f5ec935 (DEFAULT_TABLE table=test-workload [id=03d3358e53e6427b8ad498da6c2e9168]), partition=RANGE (key) PARTITION 357913941 <= VALUES < 715827882
I20251024 08:16:03.827666 19042 tablet_service.cc:1505] Processing CreateTablet for tablet 9e9dd8c4736b4a428f5ed6b8e81d2d81 (DEFAULT_TABLE table=test-workload [id=03d3358e53e6427b8ad498da6c2e9168]), partition=RANGE (key) PARTITION 1073741823 <= VALUES < 1431655764
I20251024 08:16:03.827693 18911 tablet_service.cc:1505] Processing CreateTablet for tablet 9e9dd8c4736b4a428f5ed6b8e81d2d81 (DEFAULT_TABLE table=test-workload [id=03d3358e53e6427b8ad498da6c2e9168]), partition=RANGE (key) PARTITION 1073741823 <= VALUES < 1431655764
I20251024 08:16:03.827752 19041 tablet_service.cc:1505] Processing CreateTablet for tablet aa7ed2496d2143e0ac89caee8c72dbf4 (DEFAULT_TABLE table=test-workload [id=03d3358e53e6427b8ad498da6c2e9168]), partition=RANGE (key) PARTITION 1431655764 <= VALUES < 1789569705
I20251024 08:16:03.827773 18914 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 39d918845bd2430ea45706346c3a38d2. 1 dirs total, 0 dirs full, 0 dirs failed
I20251024 08:16:03.827857 18910 tablet_service.cc:1505] Processing CreateTablet for tablet aa7ed2496d2143e0ac89caee8c72dbf4 (DEFAULT_TABLE table=test-workload [id=03d3358e53e6427b8ad498da6c2e9168]), partition=RANGE (key) PARTITION 1431655764 <= VALUES < 1789569705
I20251024 08:16:03.827886 19040 tablet_service.cc:1505] Processing CreateTablet for tablet 91d51fadf3f14a5897bf73a1492c5b29 (DEFAULT_TABLE table=test-workload [id=03d3358e53e6427b8ad498da6c2e9168]), partition=RANGE (key) PARTITION 1789569705 <= VALUES
I20251024 08:16:03.827926 18910 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet aa7ed2496d2143e0ac89caee8c72dbf4. 1 dirs total, 0 dirs full, 0 dirs failed
I20251024 08:16:03.827332 19045 tablet_service.cc:1505] Processing CreateTablet for tablet 39d918845bd2430ea45706346c3a38d2 (DEFAULT_TABLE table=test-workload [id=03d3358e53e6427b8ad498da6c2e9168]), partition=RANGE (key) PARTITION VALUES < 357913941
I20251024 08:16:03.827764 19043 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet f897508fbb474c0eab2ead7fb82d2e0e. 1 dirs total, 0 dirs full, 0 dirs failed
I20251024 08:16:03.828516 18909 tablet_service.cc:1505] Processing CreateTablet for tablet 91d51fadf3f14a5897bf73a1492c5b29 (DEFAULT_TABLE table=test-workload [id=03d3358e53e6427b8ad498da6c2e9168]), partition=RANGE (key) PARTITION 1789569705 <= VALUES
I20251024 08:16:03.828634 18909 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 91d51fadf3f14a5897bf73a1492c5b29. 1 dirs total, 0 dirs full, 0 dirs failed
I20251024 08:16:03.828845 18911 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 9e9dd8c4736b4a428f5ed6b8e81d2d81. 1 dirs total, 0 dirs full, 0 dirs failed
I20251024 08:16:03.827358 18913 tablet_service.cc:1505] Processing CreateTablet for tablet 90fe2a8a32ae4fd3a0574ccd6f5ec935 (DEFAULT_TABLE table=test-workload [id=03d3358e53e6427b8ad498da6c2e9168]), partition=RANGE (key) PARTITION 357913941 <= VALUES < 715827882
I20251024 08:16:03.830062 18913 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 90fe2a8a32ae4fd3a0574ccd6f5ec935. 1 dirs total, 0 dirs full, 0 dirs failed
I20251024 08:16:03.830775 19040 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 91d51fadf3f14a5897bf73a1492c5b29. 1 dirs total, 0 dirs full, 0 dirs failed
I20251024 08:16:03.831769 19042 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 9e9dd8c4736b4a428f5ed6b8e81d2d81. 1 dirs total, 0 dirs full, 0 dirs failed
I20251024 08:16:03.832034 18912 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet f897508fbb474c0eab2ead7fb82d2e0e. 1 dirs total, 0 dirs full, 0 dirs failed
I20251024 08:16:03.832499 19041 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet aa7ed2496d2143e0ac89caee8c72dbf4. 1 dirs total, 0 dirs full, 0 dirs failed
I20251024 08:16:03.833050 19262 tablet_bootstrap.cc:492] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86: Bootstrap starting.
I20251024 08:16:03.833392 19044 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 90fe2a8a32ae4fd3a0574ccd6f5ec935. 1 dirs total, 0 dirs full, 0 dirs failed
I20251024 08:16:03.833702 19262 tablet_bootstrap.cc:654] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86: Neither blocks nor log segments found. Creating new log.
I20251024 08:16:03.834025 19262 log.cc:826] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86: Log is configured to *not* fsync() on all Append() calls
I20251024 08:16:03.834357 19261 tablet_bootstrap.cc:492] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0: Bootstrap starting.
I20251024 08:16:03.834697 19262 tablet_bootstrap.cc:492] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86: No bootstrap required, opened a new log
I20251024 08:16:03.834755 19262 ts_tablet_manager.cc:1403] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86: Time spent bootstrapping tablet: real 0.002s	user 0.001s	sys 0.000s
I20251024 08:16:03.834915 19045 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 39d918845bd2430ea45706346c3a38d2. 1 dirs total, 0 dirs full, 0 dirs failed
I20251024 08:16:03.835182 19261 tablet_bootstrap.cc:654] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0: Neither blocks nor log segments found. Creating new log.
I20251024 08:16:03.835445 19261 log.cc:826] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0: Log is configured to *not* fsync() on all Append() calls
I20251024 08:16:03.835938 19262 raft_consensus.cc:359] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.836055 19262 raft_consensus.cc:385] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251024 08:16:03.836124 19261 tablet_bootstrap.cc:492] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0: No bootstrap required, opened a new log
I20251024 08:16:03.836176 19262 raft_consensus.cc:740] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 1ca2674bda43456db1ddae2032441d86, State: Initialized, Role: FOLLOWER
I20251024 08:16:03.836309 19262 consensus_queue.cc:260] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.836376 19261 ts_tablet_manager.cc:1403] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0: Time spent bootstrapping tablet: real 0.002s	user 0.002s	sys 0.000s
I20251024 08:16:03.836676 19111 heartbeater.cc:499] Master 127.18.80.126:40485 was elected leader, sending a full tablet report...
I20251024 08:16:03.836964 19262 ts_tablet_manager.cc:1434] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
I20251024 08:16:03.837098 19262 tablet_bootstrap.cc:492] T 91d51fadf3f14a5897bf73a1492c5b29 P 1ca2674bda43456db1ddae2032441d86: Bootstrap starting.
I20251024 08:16:03.837543 19262 tablet_bootstrap.cc:654] T 91d51fadf3f14a5897bf73a1492c5b29 P 1ca2674bda43456db1ddae2032441d86: Neither blocks nor log segments found. Creating new log.
I20251024 08:16:03.837746 19176 tablet_service.cc:1505] Processing CreateTablet for tablet 39d918845bd2430ea45706346c3a38d2 (DEFAULT_TABLE table=test-workload [id=03d3358e53e6427b8ad498da6c2e9168]), partition=RANGE (key) PARTITION VALUES < 357913941
I20251024 08:16:03.837898 19261 raft_consensus.cc:359] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.838018 19261 raft_consensus.cc:385] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251024 08:16:03.838052 19261 raft_consensus.cc:740] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 01590ebaf5524b53b67dbcd5ce628ab0, State: Initialized, Role: FOLLOWER
I20251024 08:16:03.838102 19176 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 39d918845bd2430ea45706346c3a38d2. 1 dirs total, 0 dirs full, 0 dirs failed
I20251024 08:16:03.838146 19262 tablet_bootstrap.cc:492] T 91d51fadf3f14a5897bf73a1492c5b29 P 1ca2674bda43456db1ddae2032441d86: No bootstrap required, opened a new log
I20251024 08:16:03.838145 19261 consensus_queue.cc:260] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.838187 19262 ts_tablet_manager.cc:1403] T 91d51fadf3f14a5897bf73a1492c5b29 P 1ca2674bda43456db1ddae2032441d86: Time spent bootstrapping tablet: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:03.838310 19262 raft_consensus.cc:359] T 91d51fadf3f14a5897bf73a1492c5b29 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.838368 19262 raft_consensus.cc:385] T 91d51fadf3f14a5897bf73a1492c5b29 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251024 08:16:03.838392 19262 raft_consensus.cc:740] T 91d51fadf3f14a5897bf73a1492c5b29 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 1ca2674bda43456db1ddae2032441d86, State: Initialized, Role: FOLLOWER
I20251024 08:16:03.838382 19261 ts_tablet_manager.cc:1434] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
I20251024 08:16:03.838423 18980 heartbeater.cc:499] Master 127.18.80.126:40485 was elected leader, sending a full tablet report...
I20251024 08:16:03.838433 19262 consensus_queue.cc:260] T 91d51fadf3f14a5897bf73a1492c5b29 P 1ca2674bda43456db1ddae2032441d86 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.838462 19261 tablet_bootstrap.cc:492] T 39d918845bd2430ea45706346c3a38d2 P 01590ebaf5524b53b67dbcd5ce628ab0: Bootstrap starting.
I20251024 08:16:03.838507 19262 ts_tablet_manager.cc:1434] T 91d51fadf3f14a5897bf73a1492c5b29 P 1ca2674bda43456db1ddae2032441d86: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:03.838562 19262 tablet_bootstrap.cc:492] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86: Bootstrap starting.
I20251024 08:16:03.838873 19261 tablet_bootstrap.cc:654] T 39d918845bd2430ea45706346c3a38d2 P 01590ebaf5524b53b67dbcd5ce628ab0: Neither blocks nor log segments found. Creating new log.
I20251024 08:16:03.839003 19262 tablet_bootstrap.cc:654] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86: Neither blocks nor log segments found. Creating new log.
I20251024 08:16:03.839682 19262 tablet_bootstrap.cc:492] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86: No bootstrap required, opened a new log
I20251024 08:16:03.839728 19262 ts_tablet_manager.cc:1403] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86: Time spent bootstrapping tablet: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:03.839864 19262 raft_consensus.cc:359] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.839917 19262 raft_consensus.cc:385] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251024 08:16:03.839938 19262 raft_consensus.cc:740] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 1ca2674bda43456db1ddae2032441d86, State: Initialized, Role: FOLLOWER
I20251024 08:16:03.839974 19262 consensus_queue.cc:260] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.840044 19262 ts_tablet_manager.cc:1434] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:03.840086 19262 tablet_bootstrap.cc:492] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86: Bootstrap starting.
I20251024 08:16:03.840432 19270 tablet_bootstrap.cc:492] T 39d918845bd2430ea45706346c3a38d2 P 124688609f6a4035893a031320ea1d52: Bootstrap starting.
I20251024 08:16:03.840504 19175 tablet_service.cc:1505] Processing CreateTablet for tablet 90fe2a8a32ae4fd3a0574ccd6f5ec935 (DEFAULT_TABLE table=test-workload [id=03d3358e53e6427b8ad498da6c2e9168]), partition=RANGE (key) PARTITION 357913941 <= VALUES < 715827882
I20251024 08:16:03.840579 19175 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 90fe2a8a32ae4fd3a0574ccd6f5ec935. 1 dirs total, 0 dirs full, 0 dirs failed
I20251024 08:16:03.840607 19262 tablet_bootstrap.cc:654] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86: Neither blocks nor log segments found. Creating new log.
I20251024 08:16:03.841003 19270 tablet_bootstrap.cc:654] T 39d918845bd2430ea45706346c3a38d2 P 124688609f6a4035893a031320ea1d52: Neither blocks nor log segments found. Creating new log.
I20251024 08:16:03.841114 19262 tablet_bootstrap.cc:492] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86: No bootstrap required, opened a new log
I20251024 08:16:03.841151 19262 ts_tablet_manager.cc:1403] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86: Time spent bootstrapping tablet: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:03.841245 19270 log.cc:826] T 39d918845bd2430ea45706346c3a38d2 P 124688609f6a4035893a031320ea1d52: Log is configured to *not* fsync() on all Append() calls
I20251024 08:16:03.841261 19262 raft_consensus.cc:359] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.841295 19262 raft_consensus.cc:385] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251024 08:16:03.841305 19262 raft_consensus.cc:740] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 1ca2674bda43456db1ddae2032441d86, State: Initialized, Role: FOLLOWER
I20251024 08:16:03.841344 19262 consensus_queue.cc:260] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.841420 19262 ts_tablet_manager.cc:1434] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:03.841480 19262 tablet_bootstrap.cc:492] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 1ca2674bda43456db1ddae2032441d86: Bootstrap starting.
I20251024 08:16:03.841516 19272 raft_consensus.cc:493] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251024 08:16:03.841574 19272 raft_consensus.cc:515] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.841776 19174 tablet_service.cc:1505] Processing CreateTablet for tablet f897508fbb474c0eab2ead7fb82d2e0e (DEFAULT_TABLE table=test-workload [id=03d3358e53e6427b8ad498da6c2e9168]), partition=RANGE (key) PARTITION 715827882 <= VALUES < 1073741823
I20251024 08:16:03.841884 19262 tablet_bootstrap.cc:654] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 1ca2674bda43456db1ddae2032441d86: Neither blocks nor log segments found. Creating new log.
I20251024 08:16:03.841890 19174 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet f897508fbb474c0eab2ead7fb82d2e0e. 1 dirs total, 0 dirs full, 0 dirs failed
I20251024 08:16:03.842546 19272 leader_election.cc:290] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683), 124688609f6a4035893a031320ea1d52 (127.18.80.67:45549)
I20251024 08:16:03.842582 19270 tablet_bootstrap.cc:492] T 39d918845bd2430ea45706346c3a38d2 P 124688609f6a4035893a031320ea1d52: No bootstrap required, opened a new log
I20251024 08:16:03.842636 19270 ts_tablet_manager.cc:1403] T 39d918845bd2430ea45706346c3a38d2 P 124688609f6a4035893a031320ea1d52: Time spent bootstrapping tablet: real 0.002s	user 0.000s	sys 0.001s
I20251024 08:16:03.842989 19262 tablet_bootstrap.cc:492] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 1ca2674bda43456db1ddae2032441d86: No bootstrap required, opened a new log
I20251024 08:16:03.843024 19262 ts_tablet_manager.cc:1403] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 1ca2674bda43456db1ddae2032441d86: Time spent bootstrapping tablet: real 0.002s	user 0.001s	sys 0.000s
I20251024 08:16:03.843111 19173 tablet_service.cc:1505] Processing CreateTablet for tablet 9e9dd8c4736b4a428f5ed6b8e81d2d81 (DEFAULT_TABLE table=test-workload [id=03d3358e53e6427b8ad498da6c2e9168]), partition=RANGE (key) PARTITION 1073741823 <= VALUES < 1431655764
I20251024 08:16:03.843153 19262 raft_consensus.cc:359] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.843197 19262 raft_consensus.cc:385] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251024 08:16:03.843192 19173 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 9e9dd8c4736b4a428f5ed6b8e81d2d81. 1 dirs total, 0 dirs full, 0 dirs failed
I20251024 08:16:03.843214 19262 raft_consensus.cc:740] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 1ca2674bda43456db1ddae2032441d86, State: Initialized, Role: FOLLOWER
I20251024 08:16:03.843266 19262 consensus_queue.cc:260] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 1ca2674bda43456db1ddae2032441d86 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.843341 19262 ts_tablet_manager.cc:1434] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 1ca2674bda43456db1ddae2032441d86: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:03.843393 19262 tablet_bootstrap.cc:492] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86: Bootstrap starting.
I20251024 08:16:03.843742 19262 tablet_bootstrap.cc:654] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86: Neither blocks nor log segments found. Creating new log.
I20251024 08:16:03.844537 19172 tablet_service.cc:1505] Processing CreateTablet for tablet aa7ed2496d2143e0ac89caee8c72dbf4 (DEFAULT_TABLE table=test-workload [id=03d3358e53e6427b8ad498da6c2e9168]), partition=RANGE (key) PARTITION 1431655764 <= VALUES < 1789569705
I20251024 08:16:03.844614 19172 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet aa7ed2496d2143e0ac89caee8c72dbf4. 1 dirs total, 0 dirs full, 0 dirs failed
I20251024 08:16:03.844862 19270 raft_consensus.cc:359] T 39d918845bd2430ea45706346c3a38d2 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.845005 19270 raft_consensus.cc:385] T 39d918845bd2430ea45706346c3a38d2 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251024 08:16:03.845033 19270 raft_consensus.cc:740] T 39d918845bd2430ea45706346c3a38d2 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 124688609f6a4035893a031320ea1d52, State: Initialized, Role: FOLLOWER
I20251024 08:16:03.845170 19270 consensus_queue.cc:260] T 39d918845bd2430ea45706346c3a38d2 P 124688609f6a4035893a031320ea1d52 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.845402 19270 ts_tablet_manager.cc:1434] T 39d918845bd2430ea45706346c3a38d2 P 124688609f6a4035893a031320ea1d52: Time spent starting tablet: real 0.003s	user 0.002s	sys 0.001s
I20251024 08:16:03.845451 19242 heartbeater.cc:499] Master 127.18.80.126:40485 was elected leader, sending a full tablet report...
I20251024 08:16:03.845471 19270 tablet_bootstrap.cc:492] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52: Bootstrap starting.
I20251024 08:16:03.845752 19171 tablet_service.cc:1505] Processing CreateTablet for tablet 91d51fadf3f14a5897bf73a1492c5b29 (DEFAULT_TABLE table=test-workload [id=03d3358e53e6427b8ad498da6c2e9168]), partition=RANGE (key) PARTITION 1789569705 <= VALUES
I20251024 08:16:03.845837 19171 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 91d51fadf3f14a5897bf73a1492c5b29. 1 dirs total, 0 dirs full, 0 dirs failed
I20251024 08:16:03.845875 19270 tablet_bootstrap.cc:654] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52: Neither blocks nor log segments found. Creating new log.
I20251024 08:16:03.846344 19270 tablet_bootstrap.cc:492] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52: No bootstrap required, opened a new log
I20251024 08:16:03.846380 19270 ts_tablet_manager.cc:1403] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52: Time spent bootstrapping tablet: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:03.846511 19270 raft_consensus.cc:359] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.846580 19270 raft_consensus.cc:385] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251024 08:16:03.846599 19270 raft_consensus.cc:740] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 124688609f6a4035893a031320ea1d52, State: Initialized, Role: FOLLOWER
I20251024 08:16:03.846642 19270 consensus_queue.cc:260] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.846714 19270 ts_tablet_manager.cc:1434] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:03.846756 19270 tablet_bootstrap.cc:492] T f897508fbb474c0eab2ead7fb82d2e0e P 124688609f6a4035893a031320ea1d52: Bootstrap starting.
I20251024 08:16:03.847119 19270 tablet_bootstrap.cc:654] T f897508fbb474c0eab2ead7fb82d2e0e P 124688609f6a4035893a031320ea1d52: Neither blocks nor log segments found. Creating new log.
I20251024 08:16:03.847373 18934 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "f897508fbb474c0eab2ead7fb82d2e0e" candidate_uuid: "1ca2674bda43456db1ddae2032441d86" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" is_pre_election: true
I20251024 08:16:03.847708 19270 tablet_bootstrap.cc:492] T f897508fbb474c0eab2ead7fb82d2e0e P 124688609f6a4035893a031320ea1d52: No bootstrap required, opened a new log
I20251024 08:16:03.847752 19270 ts_tablet_manager.cc:1403] T f897508fbb474c0eab2ead7fb82d2e0e P 124688609f6a4035893a031320ea1d52: Time spent bootstrapping tablet: real 0.001s	user 0.000s	sys 0.000s
W20251024 08:16:03.847730 19000 leader_election.cc:343] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 [CANDIDATE]: Term 1 pre-election: Tablet error from VoteRequest() call to peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Illegal state: must be running to vote when last-logged opid is not known
I20251024 08:16:03.847911 19270 raft_consensus.cc:359] T f897508fbb474c0eab2ead7fb82d2e0e P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.847965 19270 raft_consensus.cc:385] T f897508fbb474c0eab2ead7fb82d2e0e P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251024 08:16:03.847991 19270 raft_consensus.cc:740] T f897508fbb474c0eab2ead7fb82d2e0e P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 124688609f6a4035893a031320ea1d52, State: Initialized, Role: FOLLOWER
I20251024 08:16:03.848022 19270 consensus_queue.cc:260] T f897508fbb474c0eab2ead7fb82d2e0e P 124688609f6a4035893a031320ea1d52 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.848093 19270 ts_tablet_manager.cc:1434] T f897508fbb474c0eab2ead7fb82d2e0e P 124688609f6a4035893a031320ea1d52: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:03.848138 19270 tablet_bootstrap.cc:492] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 124688609f6a4035893a031320ea1d52: Bootstrap starting.
I20251024 08:16:03.848780 19270 tablet_bootstrap.cc:654] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 124688609f6a4035893a031320ea1d52: Neither blocks nor log segments found. Creating new log.
I20251024 08:16:03.849056 19262 tablet_bootstrap.cc:492] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86: No bootstrap required, opened a new log
I20251024 08:16:03.849105 19262 ts_tablet_manager.cc:1403] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86: Time spent bootstrapping tablet: real 0.006s	user 0.001s	sys 0.000s
I20251024 08:16:03.849267 19262 raft_consensus.cc:359] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.849308 19261 tablet_bootstrap.cc:492] T 39d918845bd2430ea45706346c3a38d2 P 01590ebaf5524b53b67dbcd5ce628ab0: No bootstrap required, opened a new log
I20251024 08:16:03.849330 19262 raft_consensus.cc:385] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251024 08:16:03.849349 19262 raft_consensus.cc:740] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 1ca2674bda43456db1ddae2032441d86, State: Initialized, Role: FOLLOWER
I20251024 08:16:03.849350 19261 ts_tablet_manager.cc:1403] T 39d918845bd2430ea45706346c3a38d2 P 01590ebaf5524b53b67dbcd5ce628ab0: Time spent bootstrapping tablet: real 0.011s	user 0.001s	sys 0.000s
I20251024 08:16:03.849406 19262 consensus_queue.cc:260] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.849488 19262 ts_tablet_manager.cc:1434] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:03.849488 19261 raft_consensus.cc:359] T 39d918845bd2430ea45706346c3a38d2 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.849555 19261 raft_consensus.cc:385] T 39d918845bd2430ea45706346c3a38d2 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251024 08:16:03.849575 19261 raft_consensus.cc:740] T 39d918845bd2430ea45706346c3a38d2 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 01590ebaf5524b53b67dbcd5ce628ab0, State: Initialized, Role: FOLLOWER
I20251024 08:16:03.849615 19261 consensus_queue.cc:260] T 39d918845bd2430ea45706346c3a38d2 P 01590ebaf5524b53b67dbcd5ce628ab0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.849676 19261 ts_tablet_manager.cc:1434] T 39d918845bd2430ea45706346c3a38d2 P 01590ebaf5524b53b67dbcd5ce628ab0: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:03.849802 19261 tablet_bootstrap.cc:492] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 01590ebaf5524b53b67dbcd5ce628ab0: Bootstrap starting.
I20251024 08:16:03.850230 19261 tablet_bootstrap.cc:654] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 01590ebaf5524b53b67dbcd5ce628ab0: Neither blocks nor log segments found. Creating new log.
I20251024 08:16:03.850901 19270 tablet_bootstrap.cc:492] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 124688609f6a4035893a031320ea1d52: No bootstrap required, opened a new log
I20251024 08:16:03.850943 19270 ts_tablet_manager.cc:1403] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 124688609f6a4035893a031320ea1d52: Time spent bootstrapping tablet: real 0.003s	user 0.000s	sys 0.001s
I20251024 08:16:03.851085 19261 tablet_bootstrap.cc:492] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 01590ebaf5524b53b67dbcd5ce628ab0: No bootstrap required, opened a new log
I20251024 08:16:03.851081 19270 raft_consensus.cc:359] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.851121 19261 ts_tablet_manager.cc:1403] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 01590ebaf5524b53b67dbcd5ce628ab0: Time spent bootstrapping tablet: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:03.851138 19270 raft_consensus.cc:385] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251024 08:16:03.851159 19270 raft_consensus.cc:740] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 124688609f6a4035893a031320ea1d52, State: Initialized, Role: FOLLOWER
I20251024 08:16:03.851213 19270 consensus_queue.cc:260] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 124688609f6a4035893a031320ea1d52 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.851298 19270 ts_tablet_manager.cc:1434] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 124688609f6a4035893a031320ea1d52: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:03.851277 19261 raft_consensus.cc:359] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.851331 19261 raft_consensus.cc:385] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251024 08:16:03.851348 19270 tablet_bootstrap.cc:492] T aa7ed2496d2143e0ac89caee8c72dbf4 P 124688609f6a4035893a031320ea1d52: Bootstrap starting.
I20251024 08:16:03.851352 19261 raft_consensus.cc:740] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 01590ebaf5524b53b67dbcd5ce628ab0, State: Initialized, Role: FOLLOWER
I20251024 08:16:03.851397 19261 consensus_queue.cc:260] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 01590ebaf5524b53b67dbcd5ce628ab0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.851481 19261 ts_tablet_manager.cc:1434] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 01590ebaf5524b53b67dbcd5ce628ab0: Time spent starting tablet: real 0.000s	user 0.001s	sys 0.000s
I20251024 08:16:03.851578 19261 tablet_bootstrap.cc:492] T 91d51fadf3f14a5897bf73a1492c5b29 P 01590ebaf5524b53b67dbcd5ce628ab0: Bootstrap starting.
I20251024 08:16:03.851807 19270 tablet_bootstrap.cc:654] T aa7ed2496d2143e0ac89caee8c72dbf4 P 124688609f6a4035893a031320ea1d52: Neither blocks nor log segments found. Creating new log.
I20251024 08:16:03.851950 19261 tablet_bootstrap.cc:654] T 91d51fadf3f14a5897bf73a1492c5b29 P 01590ebaf5524b53b67dbcd5ce628ab0: Neither blocks nor log segments found. Creating new log.
I20251024 08:16:03.852067 19196 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "f897508fbb474c0eab2ead7fb82d2e0e" candidate_uuid: "1ca2674bda43456db1ddae2032441d86" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "124688609f6a4035893a031320ea1d52" is_pre_election: true
I20251024 08:16:03.852191 19196 raft_consensus.cc:2468] T f897508fbb474c0eab2ead7fb82d2e0e P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 1ca2674bda43456db1ddae2032441d86 in term 0.
I20251024 08:16:03.852304 19270 tablet_bootstrap.cc:492] T aa7ed2496d2143e0ac89caee8c72dbf4 P 124688609f6a4035893a031320ea1d52: No bootstrap required, opened a new log
I20251024 08:16:03.852337 19270 ts_tablet_manager.cc:1403] T aa7ed2496d2143e0ac89caee8c72dbf4 P 124688609f6a4035893a031320ea1d52: Time spent bootstrapping tablet: real 0.001s	user 0.000s	sys 0.001s
I20251024 08:16:03.852375 18999 leader_election.cc:304] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 124688609f6a4035893a031320ea1d52, 1ca2674bda43456db1ddae2032441d86; no voters: 01590ebaf5524b53b67dbcd5ce628ab0
I20251024 08:16:03.852487 19270 raft_consensus.cc:359] T aa7ed2496d2143e0ac89caee8c72dbf4 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.852533 19270 raft_consensus.cc:385] T aa7ed2496d2143e0ac89caee8c72dbf4 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251024 08:16:03.852556 19270 raft_consensus.cc:740] T aa7ed2496d2143e0ac89caee8c72dbf4 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 124688609f6a4035893a031320ea1d52, State: Initialized, Role: FOLLOWER
I20251024 08:16:03.852540 19272 raft_consensus.cc:2804] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20251024 08:16:03.852602 19272 raft_consensus.cc:493] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251024 08:16:03.852627 19272 raft_consensus.cc:3060] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Advancing to term 1
I20251024 08:16:03.852608 19270 consensus_queue.cc:260] T aa7ed2496d2143e0ac89caee8c72dbf4 P 124688609f6a4035893a031320ea1d52 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.852638 19261 tablet_bootstrap.cc:492] T 91d51fadf3f14a5897bf73a1492c5b29 P 01590ebaf5524b53b67dbcd5ce628ab0: No bootstrap required, opened a new log
I20251024 08:16:03.852679 19261 ts_tablet_manager.cc:1403] T 91d51fadf3f14a5897bf73a1492c5b29 P 01590ebaf5524b53b67dbcd5ce628ab0: Time spent bootstrapping tablet: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:03.852684 19270 ts_tablet_manager.cc:1434] T aa7ed2496d2143e0ac89caee8c72dbf4 P 124688609f6a4035893a031320ea1d52: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:03.852743 19270 tablet_bootstrap.cc:492] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52: Bootstrap starting.
I20251024 08:16:03.852799 19261 raft_consensus.cc:359] T 91d51fadf3f14a5897bf73a1492c5b29 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.852859 19261 raft_consensus.cc:385] T 91d51fadf3f14a5897bf73a1492c5b29 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251024 08:16:03.852880 19261 raft_consensus.cc:740] T 91d51fadf3f14a5897bf73a1492c5b29 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 01590ebaf5524b53b67dbcd5ce628ab0, State: Initialized, Role: FOLLOWER
I20251024 08:16:03.852938 19261 consensus_queue.cc:260] T 91d51fadf3f14a5897bf73a1492c5b29 P 01590ebaf5524b53b67dbcd5ce628ab0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.853015 19261 ts_tablet_manager.cc:1434] T 91d51fadf3f14a5897bf73a1492c5b29 P 01590ebaf5524b53b67dbcd5ce628ab0: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:03.853058 19261 tablet_bootstrap.cc:492] T f897508fbb474c0eab2ead7fb82d2e0e P 01590ebaf5524b53b67dbcd5ce628ab0: Bootstrap starting.
I20251024 08:16:03.853125 19270 tablet_bootstrap.cc:654] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52: Neither blocks nor log segments found. Creating new log.
I20251024 08:16:03.853456 19261 tablet_bootstrap.cc:654] T f897508fbb474c0eab2ead7fb82d2e0e P 01590ebaf5524b53b67dbcd5ce628ab0: Neither blocks nor log segments found. Creating new log.
I20251024 08:16:03.853541 19272 raft_consensus.cc:515] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.853672 19272 leader_election.cc:290] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 [CANDIDATE]: Term 1 election: Requested vote from peers 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683), 124688609f6a4035893a031320ea1d52 (127.18.80.67:45549)
I20251024 08:16:03.853946 18934 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "f897508fbb474c0eab2ead7fb82d2e0e" candidate_uuid: "1ca2674bda43456db1ddae2032441d86" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "01590ebaf5524b53b67dbcd5ce628ab0"
I20251024 08:16:03.854039 19261 tablet_bootstrap.cc:492] T f897508fbb474c0eab2ead7fb82d2e0e P 01590ebaf5524b53b67dbcd5ce628ab0: No bootstrap required, opened a new log
I20251024 08:16:03.854089 19261 ts_tablet_manager.cc:1403] T f897508fbb474c0eab2ead7fb82d2e0e P 01590ebaf5524b53b67dbcd5ce628ab0: Time spent bootstrapping tablet: real 0.001s	user 0.001s	sys 0.000s
W20251024 08:16:03.854131 19000 leader_election.cc:343] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 [CANDIDATE]: Term 1 election: Tablet error from VoteRequest() call to peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Illegal state: must be running to vote when last-logged opid is not known
I20251024 08:16:03.854233 19261 raft_consensus.cc:359] T f897508fbb474c0eab2ead7fb82d2e0e P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.854287 19261 raft_consensus.cc:385] T f897508fbb474c0eab2ead7fb82d2e0e P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251024 08:16:03.854310 19261 raft_consensus.cc:740] T f897508fbb474c0eab2ead7fb82d2e0e P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 01590ebaf5524b53b67dbcd5ce628ab0, State: Initialized, Role: FOLLOWER
I20251024 08:16:03.854348 19261 consensus_queue.cc:260] T f897508fbb474c0eab2ead7fb82d2e0e P 01590ebaf5524b53b67dbcd5ce628ab0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.854445 19261 ts_tablet_manager.cc:1434] T f897508fbb474c0eab2ead7fb82d2e0e P 01590ebaf5524b53b67dbcd5ce628ab0: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:03.854455 19196 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "f897508fbb474c0eab2ead7fb82d2e0e" candidate_uuid: "1ca2674bda43456db1ddae2032441d86" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "124688609f6a4035893a031320ea1d52"
I20251024 08:16:03.854498 19261 tablet_bootstrap.cc:492] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 01590ebaf5524b53b67dbcd5ce628ab0: Bootstrap starting.
I20251024 08:16:03.854514 19196 raft_consensus.cc:3060] T f897508fbb474c0eab2ead7fb82d2e0e P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Advancing to term 1
I20251024 08:16:03.855067 19196 raft_consensus.cc:2468] T f897508fbb474c0eab2ead7fb82d2e0e P 124688609f6a4035893a031320ea1d52 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 1ca2674bda43456db1ddae2032441d86 in term 1.
I20251024 08:16:03.855113 19261 tablet_bootstrap.cc:654] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 01590ebaf5524b53b67dbcd5ce628ab0: Neither blocks nor log segments found. Creating new log.
I20251024 08:16:03.855222 18999 leader_election.cc:304] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 124688609f6a4035893a031320ea1d52, 1ca2674bda43456db1ddae2032441d86; no voters: 01590ebaf5524b53b67dbcd5ce628ab0
I20251024 08:16:03.855298 19272 raft_consensus.cc:2804] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 [term 1 FOLLOWER]: Leader election won for term 1
I20251024 08:16:03.855341 19272 raft_consensus.cc:697] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 [term 1 LEADER]: Becoming Leader. State: Replica: 1ca2674bda43456db1ddae2032441d86, State: Running, Role: LEADER
I20251024 08:16:03.855407 19272 consensus_queue.cc:237] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.855760 19261 tablet_bootstrap.cc:492] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 01590ebaf5524b53b67dbcd5ce628ab0: No bootstrap required, opened a new log
I20251024 08:16:03.855808 19261 ts_tablet_manager.cc:1403] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 01590ebaf5524b53b67dbcd5ce628ab0: Time spent bootstrapping tablet: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:03.855870 19270 tablet_bootstrap.cc:492] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52: No bootstrap required, opened a new log
I20251024 08:16:03.855909 19270 ts_tablet_manager.cc:1403] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52: Time spent bootstrapping tablet: real 0.003s	user 0.001s	sys 0.000s
I20251024 08:16:03.855906 19261 raft_consensus.cc:359] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.855957 19261 raft_consensus.cc:385] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251024 08:16:03.855979 19261 raft_consensus.cc:740] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 01590ebaf5524b53b67dbcd5ce628ab0, State: Initialized, Role: FOLLOWER
I20251024 08:16:03.856014 19261 consensus_queue.cc:260] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 01590ebaf5524b53b67dbcd5ce628ab0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.856035 19270 raft_consensus.cc:359] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.856084 19261 ts_tablet_manager.cc:1434] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 01590ebaf5524b53b67dbcd5ce628ab0: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:03.856091 19270 raft_consensus.cc:385] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251024 08:16:03.856112 19270 raft_consensus.cc:740] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 124688609f6a4035893a031320ea1d52, State: Initialized, Role: FOLLOWER
I20251024 08:16:03.856149 19270 consensus_queue.cc:260] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.856225 19270 ts_tablet_manager.cc:1434] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:03.856129 18791 catalog_manager.cc:5649] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 reported cstate change: term changed from 0 to 1, leader changed from <none> to 1ca2674bda43456db1ddae2032441d86 (127.18.80.66). New cstate: current_term: 1 leader_uuid: "1ca2674bda43456db1ddae2032441d86" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } health_report { overall_health: UNKNOWN } } }
I20251024 08:16:03.856666 19272 raft_consensus.cc:493] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251024 08:16:03.856719 19272 raft_consensus.cc:515] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.856830 19272 leader_election.cc:290] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683), 124688609f6a4035893a031320ea1d52 (127.18.80.67:45549)
I20251024 08:16:03.856973 19196 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "9e9dd8c4736b4a428f5ed6b8e81d2d81" candidate_uuid: "1ca2674bda43456db1ddae2032441d86" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "124688609f6a4035893a031320ea1d52" is_pre_election: true
I20251024 08:16:03.857033 19196 raft_consensus.cc:2468] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 1ca2674bda43456db1ddae2032441d86 in term 0.
I20251024 08:16:03.857014 18934 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "9e9dd8c4736b4a428f5ed6b8e81d2d81" candidate_uuid: "1ca2674bda43456db1ddae2032441d86" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" is_pre_election: true
I20251024 08:16:03.857115 18934 raft_consensus.cc:2468] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 1ca2674bda43456db1ddae2032441d86 in term 0.
I20251024 08:16:03.857156 18999 leader_election.cc:304] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 124688609f6a4035893a031320ea1d52, 1ca2674bda43456db1ddae2032441d86; no voters: 
I20251024 08:16:03.857219 19272 raft_consensus.cc:2804] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20251024 08:16:03.857252 19272 raft_consensus.cc:493] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251024 08:16:03.857276 19272 raft_consensus.cc:3060] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Advancing to term 1
I20251024 08:16:03.857725 19272 raft_consensus.cc:515] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.857862 19272 leader_election.cc:290] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 [CANDIDATE]: Term 1 election: Requested vote from peers 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683), 124688609f6a4035893a031320ea1d52 (127.18.80.67:45549)
I20251024 08:16:03.858002 18934 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "9e9dd8c4736b4a428f5ed6b8e81d2d81" candidate_uuid: "1ca2674bda43456db1ddae2032441d86" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "01590ebaf5524b53b67dbcd5ce628ab0"
I20251024 08:16:03.858036 19196 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "9e9dd8c4736b4a428f5ed6b8e81d2d81" candidate_uuid: "1ca2674bda43456db1ddae2032441d86" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "124688609f6a4035893a031320ea1d52"
I20251024 08:16:03.858085 19196 raft_consensus.cc:3060] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Advancing to term 1
I20251024 08:16:03.858085 18934 raft_consensus.cc:3060] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Advancing to term 1
I20251024 08:16:03.858716 19196 raft_consensus.cc:2468] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 124688609f6a4035893a031320ea1d52 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 1ca2674bda43456db1ddae2032441d86 in term 1.
I20251024 08:16:03.858855 18934 raft_consensus.cc:2468] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 1ca2674bda43456db1ddae2032441d86 in term 1.
I20251024 08:16:03.858883 18999 leader_election.cc:304] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 124688609f6a4035893a031320ea1d52, 1ca2674bda43456db1ddae2032441d86; no voters: 
I20251024 08:16:03.858951 19272 raft_consensus.cc:2804] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 [term 1 FOLLOWER]: Leader election won for term 1
I20251024 08:16:03.858992 19272 raft_consensus.cc:697] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 [term 1 LEADER]: Becoming Leader. State: Replica: 1ca2674bda43456db1ddae2032441d86, State: Running, Role: LEADER
I20251024 08:16:03.859052 19272 consensus_queue.cc:237] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.859556 18791 catalog_manager.cc:5649] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 reported cstate change: term changed from 0 to 1, leader changed from <none> to 1ca2674bda43456db1ddae2032441d86 (127.18.80.66). New cstate: current_term: 1 leader_uuid: "1ca2674bda43456db1ddae2032441d86" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } health_report { overall_health: UNKNOWN } } }
I20251024 08:16:03.860826 19277 raft_consensus.cc:493] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251024 08:16:03.860873 19277 raft_consensus.cc:515] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.861127 19277 leader_election.cc:290] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683), 1ca2674bda43456db1ddae2032441d86 (127.18.80.66:44791)
I20251024 08:16:03.864010 18934 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "90fe2a8a32ae4fd3a0574ccd6f5ec935" candidate_uuid: "124688609f6a4035893a031320ea1d52" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" is_pre_election: true
I20251024 08:16:03.864053 19065 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "90fe2a8a32ae4fd3a0574ccd6f5ec935" candidate_uuid: "124688609f6a4035893a031320ea1d52" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "1ca2674bda43456db1ddae2032441d86" is_pre_election: true
I20251024 08:16:03.864099 18934 raft_consensus.cc:2468] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 124688609f6a4035893a031320ea1d52 in term 0.
I20251024 08:16:03.864128 19065 raft_consensus.cc:2468] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 124688609f6a4035893a031320ea1d52 in term 0.
I20251024 08:16:03.864266 19131 leader_election.cc:304] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 01590ebaf5524b53b67dbcd5ce628ab0, 124688609f6a4035893a031320ea1d52; no voters: 
I20251024 08:16:03.864373 19277 raft_consensus.cc:2804] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20251024 08:16:03.864423 19277 raft_consensus.cc:493] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251024 08:16:03.864449 19277 raft_consensus.cc:3060] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Advancing to term 1
I20251024 08:16:03.865028 19277 raft_consensus.cc:515] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.865126 19277 leader_election.cc:290] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 [CANDIDATE]: Term 1 election: Requested vote from peers 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683), 1ca2674bda43456db1ddae2032441d86 (127.18.80.66:44791)
I20251024 08:16:03.865321 19065 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "90fe2a8a32ae4fd3a0574ccd6f5ec935" candidate_uuid: "124688609f6a4035893a031320ea1d52" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "1ca2674bda43456db1ddae2032441d86"
I20251024 08:16:03.865378 19065 raft_consensus.cc:3060] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Advancing to term 1
I20251024 08:16:03.865895 19065 raft_consensus.cc:2468] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 1ca2674bda43456db1ddae2032441d86 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 124688609f6a4035893a031320ea1d52 in term 1.
I20251024 08:16:03.865980 18934 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "90fe2a8a32ae4fd3a0574ccd6f5ec935" candidate_uuid: "124688609f6a4035893a031320ea1d52" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "01590ebaf5524b53b67dbcd5ce628ab0"
I20251024 08:16:03.866047 18934 raft_consensus.cc:3060] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Advancing to term 1
I20251024 08:16:03.866036 19131 leader_election.cc:304] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 124688609f6a4035893a031320ea1d52, 1ca2674bda43456db1ddae2032441d86; no voters: 
I20251024 08:16:03.866119 19277 raft_consensus.cc:2804] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 [term 1 FOLLOWER]: Leader election won for term 1
I20251024 08:16:03.866254 19277 raft_consensus.cc:697] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 [term 1 LEADER]: Becoming Leader. State: Replica: 124688609f6a4035893a031320ea1d52, State: Running, Role: LEADER
I20251024 08:16:03.866343 19277 consensus_queue.cc:237] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.866619 18934 raft_consensus.cc:2468] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 124688609f6a4035893a031320ea1d52 in term 1.
I20251024 08:16:03.866994 18791 catalog_manager.cc:5649] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 reported cstate change: term changed from 0 to 1, leader changed from <none> to 124688609f6a4035893a031320ea1d52 (127.18.80.67). New cstate: current_term: 1 leader_uuid: "124688609f6a4035893a031320ea1d52" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } health_report { overall_health: HEALTHY } } }
I20251024 08:16:03.872735 19265 raft_consensus.cc:493] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251024 08:16:03.872812 19265 raft_consensus.cc:515] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.872934 19265 leader_election.cc:290] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683), 124688609f6a4035893a031320ea1d52 (127.18.80.67:45549)
I20251024 08:16:03.873047 18934 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "39d918845bd2430ea45706346c3a38d2" candidate_uuid: "1ca2674bda43456db1ddae2032441d86" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" is_pre_election: true
I20251024 08:16:03.873096 19196 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "39d918845bd2430ea45706346c3a38d2" candidate_uuid: "1ca2674bda43456db1ddae2032441d86" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "124688609f6a4035893a031320ea1d52" is_pre_election: true
I20251024 08:16:03.873127 18934 raft_consensus.cc:2468] T 39d918845bd2430ea45706346c3a38d2 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 1ca2674bda43456db1ddae2032441d86 in term 0.
I20251024 08:16:03.873164 19196 raft_consensus.cc:2468] T 39d918845bd2430ea45706346c3a38d2 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 1ca2674bda43456db1ddae2032441d86 in term 0.
I20251024 08:16:03.873253 19000 leader_election.cc:304] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 01590ebaf5524b53b67dbcd5ce628ab0, 1ca2674bda43456db1ddae2032441d86; no voters: 
I20251024 08:16:03.873370 19265 raft_consensus.cc:2804] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20251024 08:16:03.873405 19265 raft_consensus.cc:493] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251024 08:16:03.873426 19265 raft_consensus.cc:3060] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Advancing to term 1
I20251024 08:16:03.873927 19265 raft_consensus.cc:515] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.874024 19265 leader_election.cc:290] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 [CANDIDATE]: Term 1 election: Requested vote from peers 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683), 124688609f6a4035893a031320ea1d52 (127.18.80.67:45549)
I20251024 08:16:03.874163 19196 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "39d918845bd2430ea45706346c3a38d2" candidate_uuid: "1ca2674bda43456db1ddae2032441d86" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "124688609f6a4035893a031320ea1d52"
I20251024 08:16:03.874176 18934 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "39d918845bd2430ea45706346c3a38d2" candidate_uuid: "1ca2674bda43456db1ddae2032441d86" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "01590ebaf5524b53b67dbcd5ce628ab0"
I20251024 08:16:03.874220 18934 raft_consensus.cc:3060] T 39d918845bd2430ea45706346c3a38d2 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Advancing to term 1
I20251024 08:16:03.874217 19196 raft_consensus.cc:3060] T 39d918845bd2430ea45706346c3a38d2 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Advancing to term 1
I20251024 08:16:03.874672 19196 raft_consensus.cc:2468] T 39d918845bd2430ea45706346c3a38d2 P 124688609f6a4035893a031320ea1d52 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 1ca2674bda43456db1ddae2032441d86 in term 1.
I20251024 08:16:03.874692 18934 raft_consensus.cc:2468] T 39d918845bd2430ea45706346c3a38d2 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 1ca2674bda43456db1ddae2032441d86 in term 1.
I20251024 08:16:03.874845 18999 leader_election.cc:304] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 124688609f6a4035893a031320ea1d52, 1ca2674bda43456db1ddae2032441d86; no voters: 
I20251024 08:16:03.874962 19265 raft_consensus.cc:2804] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 [term 1 FOLLOWER]: Leader election won for term 1
I20251024 08:16:03.875003 19265 raft_consensus.cc:697] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 [term 1 LEADER]: Becoming Leader. State: Replica: 1ca2674bda43456db1ddae2032441d86, State: Running, Role: LEADER
I20251024 08:16:03.875049 19265 consensus_queue.cc:237] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.875641 18791 catalog_manager.cc:5649] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 reported cstate change: term changed from 0 to 1, leader changed from <none> to 1ca2674bda43456db1ddae2032441d86 (127.18.80.66). New cstate: current_term: 1 leader_uuid: "1ca2674bda43456db1ddae2032441d86" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } health_report { overall_health: UNKNOWN } } }
I20251024 08:16:03.881453 19277 raft_consensus.cc:493] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251024 08:16:03.881532 19277 raft_consensus.cc:515] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.881662 19277 leader_election.cc:290] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683), 1ca2674bda43456db1ddae2032441d86 (127.18.80.66:44791)
I20251024 08:16:03.881880 18934 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "91d51fadf3f14a5897bf73a1492c5b29" candidate_uuid: "124688609f6a4035893a031320ea1d52" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" is_pre_election: true
I20251024 08:16:03.881889 19065 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "91d51fadf3f14a5897bf73a1492c5b29" candidate_uuid: "124688609f6a4035893a031320ea1d52" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "1ca2674bda43456db1ddae2032441d86" is_pre_election: true
I20251024 08:16:03.881964 19065 raft_consensus.cc:2468] T 91d51fadf3f14a5897bf73a1492c5b29 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 124688609f6a4035893a031320ea1d52 in term 0.
I20251024 08:16:03.881964 18934 raft_consensus.cc:2468] T 91d51fadf3f14a5897bf73a1492c5b29 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 124688609f6a4035893a031320ea1d52 in term 0.
I20251024 08:16:03.882121 19131 leader_election.cc:304] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 124688609f6a4035893a031320ea1d52, 1ca2674bda43456db1ddae2032441d86; no voters: 
I20251024 08:16:03.882258 19277 raft_consensus.cc:2804] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20251024 08:16:03.882305 19277 raft_consensus.cc:493] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251024 08:16:03.882333 19277 raft_consensus.cc:3060] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Advancing to term 1
I20251024 08:16:03.882607 19267 raft_consensus.cc:493] T f897508fbb474c0eab2ead7fb82d2e0e P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251024 08:16:03.882664 19267 raft_consensus.cc:515] T f897508fbb474c0eab2ead7fb82d2e0e P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.882965 19267 leader_election.cc:290] T f897508fbb474c0eab2ead7fb82d2e0e P 01590ebaf5524b53b67dbcd5ce628ab0 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 1ca2674bda43456db1ddae2032441d86 (127.18.80.66:44791), 124688609f6a4035893a031320ea1d52 (127.18.80.67:45549)
I20251024 08:16:03.883005 19277 raft_consensus.cc:515] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.883154 19277 leader_election.cc:290] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 [CANDIDATE]: Term 1 election: Requested vote from peers 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683), 1ca2674bda43456db1ddae2032441d86 (127.18.80.66:44791)
I20251024 08:16:03.883363 18934 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "91d51fadf3f14a5897bf73a1492c5b29" candidate_uuid: "124688609f6a4035893a031320ea1d52" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "01590ebaf5524b53b67dbcd5ce628ab0"
I20251024 08:16:03.883427 18934 raft_consensus.cc:3060] T 91d51fadf3f14a5897bf73a1492c5b29 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Advancing to term 1
I20251024 08:16:03.883388 19065 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "91d51fadf3f14a5897bf73a1492c5b29" candidate_uuid: "124688609f6a4035893a031320ea1d52" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "1ca2674bda43456db1ddae2032441d86"
I20251024 08:16:03.883461 19065 raft_consensus.cc:3060] T 91d51fadf3f14a5897bf73a1492c5b29 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Advancing to term 1
I20251024 08:16:03.884034 18934 raft_consensus.cc:2468] T 91d51fadf3f14a5897bf73a1492c5b29 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 124688609f6a4035893a031320ea1d52 in term 1.
I20251024 08:16:03.884171 19131 leader_election.cc:304] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 01590ebaf5524b53b67dbcd5ce628ab0, 124688609f6a4035893a031320ea1d52; no voters: 
I20251024 08:16:03.884251 19277 raft_consensus.cc:2804] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 [term 1 FOLLOWER]: Leader election won for term 1
I20251024 08:16:03.884282 19277 raft_consensus.cc:697] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 [term 1 LEADER]: Becoming Leader. State: Replica: 124688609f6a4035893a031320ea1d52, State: Running, Role: LEADER
I20251024 08:16:03.884289 19065 raft_consensus.cc:2468] T 91d51fadf3f14a5897bf73a1492c5b29 P 1ca2674bda43456db1ddae2032441d86 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 124688609f6a4035893a031320ea1d52 in term 1.
I20251024 08:16:03.884320 19277 consensus_queue.cc:237] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.884876 18792 catalog_manager.cc:5649] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 reported cstate change: term changed from 0 to 1, leader changed from <none> to 124688609f6a4035893a031320ea1d52 (127.18.80.67). New cstate: current_term: 1 leader_uuid: "124688609f6a4035893a031320ea1d52" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } health_report { overall_health: HEALTHY } } }
I20251024 08:16:03.886063 19065 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "f897508fbb474c0eab2ead7fb82d2e0e" candidate_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "1ca2674bda43456db1ddae2032441d86" is_pre_election: true
I20251024 08:16:03.886686 19196 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "f897508fbb474c0eab2ead7fb82d2e0e" candidate_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "124688609f6a4035893a031320ea1d52" is_pre_election: true
I20251024 08:16:03.886770 19196 raft_consensus.cc:2393] T f897508fbb474c0eab2ead7fb82d2e0e P 124688609f6a4035893a031320ea1d52 [term 1 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate 01590ebaf5524b53b67dbcd5ce628ab0 in current term 1: Already voted for candidate 1ca2674bda43456db1ddae2032441d86 in this term.
I20251024 08:16:03.887009 18868 leader_election.cc:304] T f897508fbb474c0eab2ead7fb82d2e0e P 01590ebaf5524b53b67dbcd5ce628ab0 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 01590ebaf5524b53b67dbcd5ce628ab0; no voters: 124688609f6a4035893a031320ea1d52, 1ca2674bda43456db1ddae2032441d86
I20251024 08:16:03.887146 19267 raft_consensus.cc:3060] T f897508fbb474c0eab2ead7fb82d2e0e P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Advancing to term 1
I20251024 08:16:03.887682 19267 raft_consensus.cc:2749] T f897508fbb474c0eab2ead7fb82d2e0e P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Leader pre-election lost for term 1. Reason: could not achieve majority
I20251024 08:16:03.907685 19267 raft_consensus.cc:493] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251024 08:16:03.907825 19267 raft_consensus.cc:515] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.907968 19267 leader_election.cc:290] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 1ca2674bda43456db1ddae2032441d86 (127.18.80.66:44791), 124688609f6a4035893a031320ea1d52 (127.18.80.67:45549)
I20251024 08:16:03.908205 19065 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "aa7ed2496d2143e0ac89caee8c72dbf4" candidate_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "1ca2674bda43456db1ddae2032441d86" is_pre_election: true
I20251024 08:16:03.908213 19196 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "aa7ed2496d2143e0ac89caee8c72dbf4" candidate_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "124688609f6a4035893a031320ea1d52" is_pre_election: true
I20251024 08:16:03.908288 19065 raft_consensus.cc:2468] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 01590ebaf5524b53b67dbcd5ce628ab0 in term 0.
I20251024 08:16:03.908300 19196 raft_consensus.cc:2468] T aa7ed2496d2143e0ac89caee8c72dbf4 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 01590ebaf5524b53b67dbcd5ce628ab0 in term 0.
I20251024 08:16:03.908447 18868 leader_election.cc:304] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 01590ebaf5524b53b67dbcd5ce628ab0, 124688609f6a4035893a031320ea1d52; no voters: 
I20251024 08:16:03.908566 19267 raft_consensus.cc:2804] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20251024 08:16:03.908638 19267 raft_consensus.cc:493] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251024 08:16:03.908667 19267 raft_consensus.cc:3060] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 0 FOLLOWER]: Advancing to term 1
I20251024 08:16:03.909322 19267 raft_consensus.cc:515] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.909427 19267 leader_election.cc:290] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0 [CANDIDATE]: Term 1 election: Requested vote from peers 1ca2674bda43456db1ddae2032441d86 (127.18.80.66:44791), 124688609f6a4035893a031320ea1d52 (127.18.80.67:45549)
I20251024 08:16:03.909582 19196 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "aa7ed2496d2143e0ac89caee8c72dbf4" candidate_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "124688609f6a4035893a031320ea1d52"
I20251024 08:16:03.909602 19065 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "aa7ed2496d2143e0ac89caee8c72dbf4" candidate_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "1ca2674bda43456db1ddae2032441d86"
I20251024 08:16:03.909651 19196 raft_consensus.cc:3060] T aa7ed2496d2143e0ac89caee8c72dbf4 P 124688609f6a4035893a031320ea1d52 [term 0 FOLLOWER]: Advancing to term 1
I20251024 08:16:03.909657 19065 raft_consensus.cc:3060] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 [term 0 FOLLOWER]: Advancing to term 1
I20251024 08:16:03.910246 19196 raft_consensus.cc:2468] T aa7ed2496d2143e0ac89caee8c72dbf4 P 124688609f6a4035893a031320ea1d52 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 01590ebaf5524b53b67dbcd5ce628ab0 in term 1.
I20251024 08:16:03.910245 19065 raft_consensus.cc:2468] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 01590ebaf5524b53b67dbcd5ce628ab0 in term 1.
I20251024 08:16:03.910460 18869 leader_election.cc:304] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 01590ebaf5524b53b67dbcd5ce628ab0, 1ca2674bda43456db1ddae2032441d86; no voters: 
I20251024 08:16:03.910578 19267 raft_consensus.cc:2804] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Leader election won for term 1
I20251024 08:16:03.910773 19267 raft_consensus.cc:697] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 LEADER]: Becoming Leader. State: Replica: 01590ebaf5524b53b67dbcd5ce628ab0, State: Running, Role: LEADER
I20251024 08:16:03.910884 19267 consensus_queue.cc:237] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:03.911686 18792 catalog_manager.cc:5649] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0 reported cstate change: term changed from 0 to 1, leader changed from <none> to 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65). New cstate: current_term: 1 leader_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } health_report { overall_health: UNKNOWN } } }
I20251024 08:16:03.920574 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.18.80.68:0
--local_ip_for_outbound_sockets=127.18.80.68
--webserver_interface=127.18.80.68
--webserver_port=0
--tserver_master_addrs=127.18.80.126:40485
--builtin_ntp_servers=127.18.80.84:33025
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
I20251024 08:16:03.933007 19196 raft_consensus.cc:1275] T 39d918845bd2430ea45706346c3a38d2 P 124688609f6a4035893a031320ea1d52 [term 1 FOLLOWER]: Refusing update from remote peer 1ca2674bda43456db1ddae2032441d86: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251024 08:16:03.933058 18934 raft_consensus.cc:1275] T 39d918845bd2430ea45706346c3a38d2 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Refusing update from remote peer 1ca2674bda43456db1ddae2032441d86: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251024 08:16:03.933483 19272 consensus_queue.cc:1048] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Connected to new peer: Peer: permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251024 08:16:03.933995 19195 raft_consensus.cc:1275] T f897508fbb474c0eab2ead7fb82d2e0e P 124688609f6a4035893a031320ea1d52 [term 1 FOLLOWER]: Refusing update from remote peer 1ca2674bda43456db1ddae2032441d86: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251024 08:16:03.934171 19265 consensus_queue.cc:1048] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Connected to new peer: Peer: permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251024 08:16:03.934304 19265 consensus_queue.cc:1048] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Connected to new peer: Peer: permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251024 08:16:03.934571 18933 raft_consensus.cc:1275] T f897508fbb474c0eab2ead7fb82d2e0e P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Refusing update from remote peer 1ca2674bda43456db1ddae2032441d86: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251024 08:16:03.936028 18934 raft_consensus.cc:1275] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Refusing update from remote peer 1ca2674bda43456db1ddae2032441d86: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 3. (index mismatch)
I20251024 08:16:03.936236 19265 consensus_queue.cc:1048] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Connected to new peer: Peer: permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251024 08:16:03.936285 18933 raft_consensus.cc:1275] T 91d51fadf3f14a5897bf73a1492c5b29 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Refusing update from remote peer 124688609f6a4035893a031320ea1d52: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251024 08:16:03.936345 19265 consensus_queue.cc:1048] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Connected to new peer: Peer: permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251024 08:16:03.936450 18934 raft_consensus.cc:1275] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Refusing update from remote peer 124688609f6a4035893a031320ea1d52: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251024 08:16:03.936815 19064 raft_consensus.cc:1275] T 91d51fadf3f14a5897bf73a1492c5b29 P 1ca2674bda43456db1ddae2032441d86 [term 1 FOLLOWER]: Refusing update from remote peer 124688609f6a4035893a031320ea1d52: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251024 08:16:03.936921 19277 consensus_queue.cc:1048] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 [LEADER]: Connected to new peer: Peer: permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251024 08:16:03.937031 19291 consensus_queue.cc:1048] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 [LEADER]: Connected to new peer: Peer: permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251024 08:16:03.936794 19065 raft_consensus.cc:1275] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 1ca2674bda43456db1ddae2032441d86 [term 1 FOLLOWER]: Refusing update from remote peer 124688609f6a4035893a031320ea1d52: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251024 08:16:03.937373 19291 consensus_queue.cc:1048] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 [LEADER]: Connected to new peer: Peer: permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251024 08:16:03.937390 19277 consensus_queue.cc:1048] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 [LEADER]: Connected to new peer: Peer: permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251024 08:16:03.937965 19196 raft_consensus.cc:1275] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 124688609f6a4035893a031320ea1d52 [term 1 FOLLOWER]: Refusing update from remote peer 1ca2674bda43456db1ddae2032441d86: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 3. (index mismatch)
I20251024 08:16:03.938186 19265 consensus_queue.cc:1048] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Connected to new peer: Peer: permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
W20251024 08:16:03.940232 19112 tablet.cc:2378] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 1ca2674bda43456db1ddae2032441d86: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20251024 08:16:03.940305 19195 raft_consensus.cc:1275] T aa7ed2496d2143e0ac89caee8c72dbf4 P 124688609f6a4035893a031320ea1d52 [term 1 FOLLOWER]: Refusing update from remote peer 01590ebaf5524b53b67dbcd5ce628ab0: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
W20251024 08:16:03.940338 19112 tablet_replica_mm_ops.cc:318] Log GC is disabled (check --enable_log_gc)
I20251024 08:16:03.940512 19296 consensus_queue.cc:1048] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0 [LEADER]: Connected to new peer: Peer: permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251024 08:16:03.940596 19314 mvcc.cc:204] Tried to move back new op lower bound from 7214259257072979968 to 7214259256832610304. Current Snapshot: MvccSnapshot[applied={T|T < 7214259257066778624}]
I20251024 08:16:03.941183 19308 mvcc.cc:204] Tried to move back new op lower bound from 7214259257072979968 to 7214259256832610304. Current Snapshot: MvccSnapshot[applied={T|T < 7214259257066778624}]
I20251024 08:16:03.941224 19063 raft_consensus.cc:1275] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 [term 1 FOLLOWER]: Refusing update from remote peer 01590ebaf5524b53b67dbcd5ce628ab0: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251024 08:16:03.942126 19267 consensus_queue.cc:1048] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0 [LEADER]: Connected to new peer: Peer: permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251024 08:16:03.942310 19318 mvcc.cc:204] Tried to move back new op lower bound from 7214259257084076032 to 7214259256796958720. Current Snapshot: MvccSnapshot[applied={T|T < 7214259257078628352}]
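
The mvcc.cc messages above report that a caller proposed a new-op lower bound smaller than the one already adopted; a bound like this is normally monotonic, i.e. only allowed to advance. A hedged sketch of such a monotonic bound (hypothetical class, not Kudu's MVCC code):

    #include <algorithm>
    #include <cstdint>
    #include <iostream>

    // Hypothetical sketch: a new-op lower bound that may only move forward.
    // A message like the mvcc.cc lines above is emitted when a caller
    // proposes a smaller timestamp than the one already adopted.
    class OpLowerBound {
     public:
      // Returns false (and keeps the old value) if 'proposed' would move
      // the bound backwards.
      bool Adjust(uint64_t proposed) {
        if (proposed < bound_) {
          std::cout << "Tried to move back new op lower bound from " << bound_
                    << " to " << proposed << "\n";
          return false;
        }
        bound_ = std::max(bound_, proposed);
        return true;
      }
      uint64_t get() const { return bound_; }

     private:
      uint64_t bound_ = 0;
    };
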
W20251024 08:16:04.051481 19243 tablet_replica_mm_ops.cc:318] Log GC is disabled (check --enable_log_gc)
W20251024 08:16:04.066910 19302 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:04.067142 19302 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:04.067173 19302 flags.cc:425] Enabled unsafe flag: --enable_log_gc=false
W20251024 08:16:04.067200 19302 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:04.069577 19302 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:04.069677 19302 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.18.80.68
I20251024 08:16:04.072165 19302 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:33025
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.18.80.68:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.18.80.68
--webserver_port=0
--enable_log_gc=false
--tserver_master_addrs=127.18.80.126:40485
--never_fsync=true
--heap_profile_path=/tmp/kudu.19302
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.18.80.68
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:04.072430 19302 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:04.072700 19302 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:04.074247 18981 tablet_replica_mm_ops.cc:318] Log GC is disabled (check --enable_log_gc)
W20251024 08:16:04.075659 19371 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:04.076332 19369 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:04.076393 19368 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:04.077988 19302 server_base.cc:1047] running on GCE node
I20251024 08:16:04.078213 19302 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:04.078471 19302 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:04.081705 19302 hybrid_clock.cc:648] HybridClock initialized: now 1761293764081649 us; error 66 us; skew 500 ppm
I20251024 08:16:04.083184 19302 webserver.cc:492] Webserver started at http://127.18.80.68:45359/ using document root <none> and password file <none>
I20251024 08:16:04.083416 19302 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:04.083478 19302 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:04.083613 19302 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20251024 08:16:04.084941 19302 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-3/data/instance:
uuid: "2aa616b606514919b4009cac3bee5b9e"
format_stamp: "Formatted at 2025-10-24 08:16:04 on dist-test-slave-13l5"
I20251024 08:16:04.085322 19302 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-3/wal/instance:
uuid: "2aa616b606514919b4009cac3bee5b9e"
format_stamp: "Formatted at 2025-10-24 08:16:04 on dist-test-slave-13l5"
I20251024 08:16:04.093865 19302 fs_manager.cc:696] Time spent creating directory manager: real 0.008s	user 0.000s	sys 0.002s
I20251024 08:16:04.095242 19378 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:04.095530 19302 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.000s	sys 0.001s
I20251024 08:16:04.095731 19302 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-3/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-3/wal
uuid: "2aa616b606514919b4009cac3bee5b9e"
format_stamp: "Formatted at 2025-10-24 08:16:04 on dist-test-slave-13l5"
I20251024 08:16:04.095873 19302 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251024 08:16:04.109907 19302 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:04.110262 19302 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:04.110481 19302 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:04.110752 19302 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251024 08:16:04.111224 19302 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251024 08:16:04.111307 19302 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:04.111366 19302 ts_tablet_manager.cc:616] Registered 0 tablets
I20251024 08:16:04.111405 19302 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:04.118181 19302 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.68:35609
I20251024 08:16:04.118650 19302 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-3/data/info.pb
I20251024 08:16:04.126804 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 19302
I20251024 08:16:04.126883 18753 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-3/wal/instance
I20251024 08:16:04.127753 19492 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.68:35609 every 8 connection(s)
I20251024 08:16:04.139686 19493 heartbeater.cc:344] Connected to a master server at 127.18.80.126:40485
I20251024 08:16:04.139808 19493 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:04.140014 19493 heartbeater.cc:507] Master 127.18.80.126:40485 requested a full tablet report, sending...
I20251024 08:16:04.140614 18789 ts_manager.cc:194] Registered new tserver with Master: 2aa616b606514919b4009cac3bee5b9e (127.18.80.68:35609)
I20251024 08:16:04.141194 18789 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.18.80.68:43063
I20251024 08:16:04.205634 18789 ts_manager.cc:295] Set tserver state for 01590ebaf5524b53b67dbcd5ce628ab0 to MAINTENANCE_MODE
I20251024 08:16:04.207023 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 18852
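
At this point the master has recorded tserver 01590ebaf5524b53b67dbcd5ce628ab0 as MAINTENANCE_MODE and the test kills that process (pid 18852). The behaviour under test, per the test name, is that a replica whose tablet server is in maintenance mode is not re-replicated even though it looks failed. A simplified sketch of that decision (hypothetical types and names, not the actual catalog manager code):

    #include <string>
    #include <unordered_set>

    // Hypothetical view of the per-replica state the master tracks.
    struct ReplicaHealth {
      std::string ts_uuid;
      bool reachable;  // false once heartbeats/consensus traffic stop
    };

    // Decide whether a failed-looking replica should be replaced.
    // Replicas hosted on a tserver in maintenance mode are left alone.
    bool ShouldReReplicate(
        const ReplicaHealth& replica,
        const std::unordered_set<std::string>& maintenance_mode_tservers) {
      if (replica.reachable) {
        return false;  // healthy, nothing to do
      }
      if (maintenance_mode_tservers.count(replica.ts_uuid) > 0) {
        return false;  // failed, but expected: server is under maintenance
      }
      return true;  // failed and not excused: schedule a replacement replica
    }
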
W20251024 08:16:04.215196 19000 connection.cc:537] server connection from 127.18.80.65:60409 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20251024 08:16:04.215345 19000 connection.cc:537] client connection to 127.18.80.65:34683 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20251024 08:16:04.215404 19000 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20251024 08:16:04.215919 19131 connection.cc:700] client connection to 127.18.80.65:34683 send error: Network error: sendmsg error: Broken pipe (error 32)
W20251024 08:16:04.216009 19131 proxy.cc:239] Call had error, refreshing address and retrying: Network error: sendmsg error: Broken pipe (error 32)
W20251024 08:16:04.216454 19000 consensus_peers.cc:597] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251024 08:16:04.216509 19000 consensus_peers.cc:597] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251024 08:16:04.216531 19000 consensus_peers.cc:597] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251024 08:16:04.216576 19131 consensus_peers.cc:597] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251024 08:16:04.216619 19131 consensus_peers.cc:597] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251024 08:16:04.216575 19252 connection.cc:537] client connection to 127.18.80.65:34683 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20251024 08:16:04.216681 19252 meta_cache.cc:302] tablet aa7ed2496d2143e0ac89caee8c72dbf4: replica 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683) has failed: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20251024 08:16:04.220562 19022 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60580: Illegal state: replica 1ca2674bda43456db1ddae2032441d86 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.221006 19022 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60580: Illegal state: replica 1ca2674bda43456db1ddae2032441d86 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.223562 19154 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48488: Illegal state: replica 124688609f6a4035893a031320ea1d52 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.223840 19019 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60580: Illegal state: replica 1ca2674bda43456db1ddae2032441d86 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.224371 19154 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48488: Illegal state: replica 124688609f6a4035893a031320ea1d52 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.226692 19154 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48488: Illegal state: replica 124688609f6a4035893a031320ea1d52 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.236200 19019 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60580: Illegal state: replica 1ca2674bda43456db1ddae2032441d86 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.237686 19019 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60580: Illegal state: replica 1ca2674bda43456db1ddae2032441d86 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.239169 19019 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60580: Illegal state: replica 1ca2674bda43456db1ddae2032441d86 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.243723 19154 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48488: Illegal state: replica 124688609f6a4035893a031320ea1d52 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.246805 19154 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48488: Illegal state: replica 124688609f6a4035893a031320ea1d52 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.246867 19155 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48488: Illegal state: replica 124688609f6a4035893a031320ea1d52 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.260483 19019 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60580: Illegal state: replica 1ca2674bda43456db1ddae2032441d86 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.261943 19019 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60580: Illegal state: replica 1ca2674bda43456db1ddae2032441d86 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.266584 19019 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60580: Illegal state: replica 1ca2674bda43456db1ddae2032441d86 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.271203 19154 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48488: Illegal state: replica 124688609f6a4035893a031320ea1d52 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.271214 19155 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48488: Illegal state: replica 124688609f6a4035893a031320ea1d52 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.276547 19155 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48488: Illegal state: replica 124688609f6a4035893a031320ea1d52 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.296096 19019 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60580: Illegal state: replica 1ca2674bda43456db1ddae2032441d86 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.298627 19019 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60580: Illegal state: replica 1ca2674bda43456db1ddae2032441d86 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.301066 19019 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60580: Illegal state: replica 1ca2674bda43456db1ddae2032441d86 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.309643 19155 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48488: Illegal state: replica 124688609f6a4035893a031320ea1d52 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.312739 19155 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48488: Illegal state: replica 124688609f6a4035893a031320ea1d52 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.314798 19155 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48488: Illegal state: replica 124688609f6a4035893a031320ea1d52 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.340233 19019 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60580: Illegal state: replica 1ca2674bda43456db1ddae2032441d86 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.342757 19019 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60580: Illegal state: replica 1ca2674bda43456db1ddae2032441d86 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.343086 19019 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60580: Illegal state: replica 1ca2674bda43456db1ddae2032441d86 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.359767 19155 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48488: Illegal state: replica 124688609f6a4035893a031320ea1d52 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.359869 19154 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48488: Illegal state: replica 124688609f6a4035893a031320ea1d52 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.360671 19154 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48488: Illegal state: replica 124688609f6a4035893a031320ea1d52 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.398327 19022 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60580: Illegal state: replica 1ca2674bda43456db1ddae2032441d86 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.398327 19019 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60580: Illegal state: replica 1ca2674bda43456db1ddae2032441d86 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.399921 19022 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60580: Illegal state: replica 1ca2674bda43456db1ddae2032441d86 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.417610 19154 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48488: Illegal state: replica 124688609f6a4035893a031320ea1d52 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.418613 19154 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48488: Illegal state: replica 124688609f6a4035893a031320ea1d52 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.419692 19154 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48488: Illegal state: replica 124688609f6a4035893a031320ea1d52 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.460486 19022 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60580: Illegal state: replica 1ca2674bda43456db1ddae2032441d86 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.462029 19022 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60580: Illegal state: replica 1ca2674bda43456db1ddae2032441d86 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.462073 19019 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60580: Illegal state: replica 1ca2674bda43456db1ddae2032441d86 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.482908 19154 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48488: Illegal state: replica 124688609f6a4035893a031320ea1d52 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.482908 19155 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48488: Illegal state: replica 124688609f6a4035893a031320ea1d52 is not leader of this config: current role FOLLOWER
W20251024 08:16:04.484984 19155 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:48488: Illegal state: replica 124688609f6a4035893a031320ea1d52 is not leader of this config: current role FOLLOWER
I20251024 08:16:04.507131 19351 raft_consensus.cc:493] T aa7ed2496d2143e0ac89caee8c72dbf4 P 124688609f6a4035893a031320ea1d52 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 01590ebaf5524b53b67dbcd5ce628ab0)
I20251024 08:16:04.507217 19351 raft_consensus.cc:515] T aa7ed2496d2143e0ac89caee8c72dbf4 P 124688609f6a4035893a031320ea1d52 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:04.507364 19351 leader_election.cc:290] T aa7ed2496d2143e0ac89caee8c72dbf4 P 124688609f6a4035893a031320ea1d52 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683), 1ca2674bda43456db1ddae2032441d86 (127.18.80.66:44791)
I20251024 08:16:04.507613 19064 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "aa7ed2496d2143e0ac89caee8c72dbf4" candidate_uuid: "124688609f6a4035893a031320ea1d52" candidate_term: 2 candidate_status { last_received { term: 1 index: 88 } } ignore_live_leader: false dest_uuid: "1ca2674bda43456db1ddae2032441d86" is_pre_election: true
W20251024 08:16:04.507804 19131 leader_election.cc:336] T aa7ed2496d2143e0ac89caee8c72dbf4 P 124688609f6a4035893a031320ea1d52 [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111)
I20251024 08:16:04.507920 19131 leader_election.cc:304] T aa7ed2496d2143e0ac89caee8c72dbf4 P 124688609f6a4035893a031320ea1d52 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 124688609f6a4035893a031320ea1d52; no voters: 01590ebaf5524b53b67dbcd5ce628ab0, 1ca2674bda43456db1ddae2032441d86
I20251024 08:16:04.508001 19351 raft_consensus.cc:2749] T aa7ed2496d2143e0ac89caee8c72dbf4 P 124688609f6a4035893a031320ea1d52 [term 1 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20251024 08:16:04.508191 19350 raft_consensus.cc:493] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 01590ebaf5524b53b67dbcd5ce628ab0)
I20251024 08:16:04.508255 19350 raft_consensus.cc:515] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:04.508383 19350 leader_election.cc:290] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683), 124688609f6a4035893a031320ea1d52 (127.18.80.67:45549)
I20251024 08:16:04.508572 19193 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "aa7ed2496d2143e0ac89caee8c72dbf4" candidate_uuid: "1ca2674bda43456db1ddae2032441d86" candidate_term: 2 candidate_status { last_received { term: 1 index: 88 } } ignore_live_leader: false dest_uuid: "124688609f6a4035893a031320ea1d52" is_pre_election: true
I20251024 08:16:04.508661 19193 raft_consensus.cc:2468] T aa7ed2496d2143e0ac89caee8c72dbf4 P 124688609f6a4035893a031320ea1d52 [term 1 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 1ca2674bda43456db1ddae2032441d86 in term 1.
I20251024 08:16:04.508831 18999 leader_election.cc:304] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 124688609f6a4035893a031320ea1d52, 1ca2674bda43456db1ddae2032441d86; no voters: 
I20251024 08:16:04.508935 19350 raft_consensus.cc:2804] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 [term 1 FOLLOWER]: Leader pre-election won for term 2
I20251024 08:16:04.508973 19350 raft_consensus.cc:493] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 [term 1 FOLLOWER]: Starting leader election (detected failure of leader 01590ebaf5524b53b67dbcd5ce628ab0)
I20251024 08:16:04.509002 19350 raft_consensus.cc:3060] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 [term 1 FOLLOWER]: Advancing to term 2
W20251024 08:16:04.509330 19000 leader_election.cc:336] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111)
I20251024 08:16:04.509662 19350 raft_consensus.cc:515] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:04.509776 19350 leader_election.cc:290] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 [CANDIDATE]: Term 2 election: Requested vote from peers 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683), 124688609f6a4035893a031320ea1d52 (127.18.80.67:45549)
I20251024 08:16:04.509930 19193 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "aa7ed2496d2143e0ac89caee8c72dbf4" candidate_uuid: "1ca2674bda43456db1ddae2032441d86" candidate_term: 2 candidate_status { last_received { term: 1 index: 88 } } ignore_live_leader: false dest_uuid: "124688609f6a4035893a031320ea1d52"
I20251024 08:16:04.510000 19193 raft_consensus.cc:3060] T aa7ed2496d2143e0ac89caee8c72dbf4 P 124688609f6a4035893a031320ea1d52 [term 1 FOLLOWER]: Advancing to term 2
W20251024 08:16:04.510178 19000 leader_election.cc:336] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 [CANDIDATE]: Term 2 election: RPC error from VoteRequest() call to peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111)
I20251024 08:16:04.510702 19193 raft_consensus.cc:2468] T aa7ed2496d2143e0ac89caee8c72dbf4 P 124688609f6a4035893a031320ea1d52 [term 2 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 1ca2674bda43456db1ddae2032441d86 in term 2.
I20251024 08:16:04.510874 18999 leader_election.cc:304] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 124688609f6a4035893a031320ea1d52, 1ca2674bda43456db1ddae2032441d86; no voters: 01590ebaf5524b53b67dbcd5ce628ab0
I20251024 08:16:04.510957 19350 raft_consensus.cc:2804] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 [term 2 FOLLOWER]: Leader election won for term 2
I20251024 08:16:04.511005 19350 raft_consensus.cc:697] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 [term 2 LEADER]: Becoming Leader. State: Replica: 1ca2674bda43456db1ddae2032441d86, State: Running, Role: LEADER
I20251024 08:16:04.511080 19350 consensus_queue.cc:237] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 87, Committed index: 87, Last appended: 1.88, Last appended by leader: 88, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:04.511572 18789 catalog_manager.cc:5649] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 reported cstate change: term changed from 1 to 2, leader changed from 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65) to 1ca2674bda43456db1ddae2032441d86 (127.18.80.66). New cstate: current_term: 2 leader_uuid: "1ca2674bda43456db1ddae2032441d86" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } health_report { overall_health: UNKNOWN } } }
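
The election traffic above follows Raft's two-phase pattern: a pre-election that does not bump the term, and, once the candidate collects a strict majority of yes votes (2 of 3 here, with the killed peer unreachable), a real election in term 2 followed by the cstate change reported to the master. A stripped-down sketch of the majority check (illustrative only, not Kudu's raft_consensus implementation):

    #include <iostream>

    // Tally of voter responses, mirroring the "Election summary" lines above.
    struct VoteTally {
      int voters = 3;     // size of the Raft config
      int yes_votes = 0;  // includes the candidate's own vote
      int no_votes = 0;   // explicit denials and unreachable peers
    };

    // A candidate wins a (pre-)election once yes votes reach a strict majority.
    bool HasMajority(const VoteTally& t) {
      return t.yes_votes > t.voters / 2;
    }

    int main() {
      VoteTally tally{3, 2, 1};  // e.g. self + 124688...; 01590... refused/unreachable
      std::cout << "election won: " << std::boolalpha << HasMajority(tally) << "\n";
      return 0;
    }
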
I20251024 08:16:04.531255 19193 raft_consensus.cc:1275] T aa7ed2496d2143e0ac89caee8c72dbf4 P 124688609f6a4035893a031320ea1d52 [term 2 FOLLOWER]: Refusing update from remote peer 1ca2674bda43456db1ddae2032441d86: Log matching property violated. Preceding OpId in replica: term: 1 index: 88. Preceding OpId from leader: term: 2 index: 90. (index mismatch)
I20251024 08:16:04.531581 19350 consensus_queue.cc:1048] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Connected to new peer: Peer: permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 89, Last known committed idx: 87, Time since last communication: 0.000s
W20251024 08:16:04.531711 19000 consensus_peers.cc:597] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251024 08:16:04.706634 19000 consensus_peers.cc:597] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251024 08:16:04.709374 19131 consensus_peers.cc:597] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251024 08:16:04.717443 19131 consensus_peers.cc:597] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251024 08:16:04.717778 19000 consensus_peers.cc:597] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251024 08:16:04.723009 19000 consensus_peers.cc:597] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251024 08:16:05.064067 19000 consensus_peers.cc:597] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
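
The consensus_peers warnings above are throttled: the "Couldn't send request to peer" failure is logged on attempt 1 and then only every 5th retry (attempts 6, 11, 16, ...), which is why the warnings advance in steps of five. A small sketch of that kind of suppression (hypothetical helper, not the actual Kudu logging macro):

    #include <iostream>
    #include <string>

    // Emit a message on the first failure and then every Nth retry after that,
    // matching the "attempt 1, 6, 11, ..." cadence seen in the log.
    void MaybeLogRetry(int attempt, int every_nth, const std::string& msg) {
      if (attempt % every_nth == 1) {
        std::cout << msg << " This is attempt " << attempt
                  << ": this message will repeat every " << every_nth
                  << "th retry.\n";
      }
    }

    int main() {
      for (int attempt = 1; attempt <= 12; ++attempt) {
        MaybeLogRetry(attempt, 5, "Couldn't send request to peer.");
      }
      return 0;
    }
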
I20251024 08:16:05.142673 19493 heartbeater.cc:499] Master 127.18.80.126:40485 was elected leader, sending a full tablet report...
W20251024 08:16:05.209808 19000 consensus_peers.cc:597] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251024 08:16:05.211551 19131 consensus_peers.cc:597] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251024 08:16:05.222239 19000 consensus_peers.cc:597] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251024 08:16:05.243204 19000 consensus_peers.cc:597] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251024 08:16:05.258523 19131 consensus_peers.cc:597] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251024 08:16:05.525669 19000 consensus_peers.cc:597] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251024 08:16:05.713433 19131 consensus_peers.cc:597] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251024 08:16:05.734540 19000 consensus_peers.cc:597] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251024 08:16:05.737501 19000 consensus_peers.cc:597] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251024 08:16:05.767961 19000 consensus_peers.cc:597] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251024 08:16:05.808983 19131 consensus_peers.cc:597] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251024 08:16:06.065275 19000 consensus_peers.cc:597] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
I20251024 08:16:06.209493 19355 consensus_queue.cc:579] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 [LEADER]: Leader has been unable to successfully communicate with peer 01590ebaf5524b53b67dbcd5ce628ab0 for more than 2 seconds (2.005s)
W20251024 08:16:06.219743 19131 consensus_peers.cc:597] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20251024 08:16:06.233310 19503 consensus_queue.cc:579] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Leader has been unable to successfully communicate with peer 01590ebaf5524b53b67dbcd5ce628ab0 for more than 2 seconds (2.028s)
W20251024 08:16:06.238309 19000 consensus_peers.cc:597] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20251024 08:16:06.257164 19508 consensus_queue.cc:579] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Leader has been unable to successfully communicate with peer 01590ebaf5524b53b67dbcd5ce628ab0 for more than 2 seconds (2.052s)
W20251024 08:16:06.259917 19000 consensus_peers.cc:597] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20251024 08:16:06.281332 19347 consensus_queue.cc:579] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 [LEADER]: Leader has been unable to successfully communicate with peer 01590ebaf5524b53b67dbcd5ce628ab0 for more than 2 seconds (2.078s)
I20251024 08:16:06.286696 19502 consensus_queue.cc:579] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Leader has been unable to successfully communicate with peer 01590ebaf5524b53b67dbcd5ce628ab0 for more than 2 seconds (2.082s)
W20251024 08:16:06.290522 19131 consensus_peers.cc:597] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20251024 08:16:06.291885 19000 consensus_peers.cc:597] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20251024 08:16:06.611922 19309 consensus_queue.cc:579] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Leader has been unable to successfully communicate with peer 01590ebaf5524b53b67dbcd5ce628ab0 for more than 2 seconds (2.101s)
W20251024 08:16:06.616433 19000 consensus_peers.cc:597] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20251024 08:16:06.684547 19000 consensus_peers.cc:597] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251024 08:16:06.727043 19131 consensus_peers.cc:597] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251024 08:16:06.743769 19000 consensus_peers.cc:597] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251024 08:16:06.804481 19000 consensus_peers.cc:597] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251024 08:16:06.844955 19131 consensus_peers.cc:597] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251024 08:16:07.158721 19000 consensus_peers.cc:597] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251024 08:16:07.215981 19131 consensus_peers.cc:597] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
I20251024 08:16:07.223284 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 18761
W20251024 08:16:07.225327 19000 consensus_peers.cc:597] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
I20251024 08:16:07.230022 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.18.80.126:40485
--webserver_interface=127.18.80.126
--webserver_port=35279
--builtin_ntp_servers=127.18.80.84:33025
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.18.80.126:40485 with env {}
W20251024 08:16:07.247975 19000 consensus_peers.cc:597] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20251024 08:16:07.287253 19242 heartbeater.cc:646] Failed to heartbeat to 127.18.80.126:40485 (0 consecutive failures): Network error: Failed to send heartbeat to master: Client connection negotiation failed: client connection to 127.18.80.126:40485: connect: Connection refused (error 111)
W20251024 08:16:07.304702 19131 consensus_peers.cc:597] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20251024 08:16:07.319470 19000 consensus_peers.cc:597] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20251024 08:16:07.323916 19514 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:07.324261 19514 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:07.324342 19514 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:07.326586 19514 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20251024 08:16:07.326736 19514 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:07.326809 19514 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20251024 08:16:07.326838 19514 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20251024 08:16:07.329613 19514 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:33025
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.18.80.126:40485
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.18.80.126:40485
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.18.80.126
--webserver_port=35279
--never_fsync=true
--heap_profile_path=/tmp/kudu.19514
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:07.330163 19514 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:07.330595 19514 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:07.334180 19528 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:07.335225 19514 server_base.cc:1047] running on GCE node
W20251024 08:16:07.335366 19530 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:07.337356 19527 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:07.338157 19514 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:07.338466 19514 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:07.354817 19514 hybrid_clock.cc:648] HybridClock initialized: now 1761293767353716 us; error 1092 us; skew 500 ppm
I20251024 08:16:07.357434 19514 webserver.cc:492] Webserver started at http://127.18.80.126:35279/ using document root <none> and password file <none>
I20251024 08:16:07.357733 19514 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:07.357813 19514 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:07.360780 19514 fs_manager.cc:714] Time spent opening directory manager: real 0.002s	user 0.001s	sys 0.000s
I20251024 08:16:07.369022 19536 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:07.369572 19514 fs_manager.cc:730] Time spent opening block manager: real 0.008s	user 0.000s	sys 0.001s
I20251024 08:16:07.369684 19514 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/wal
uuid: "32f4918bb4bc4166aa635d60880ad40a"
format_stamp: "Formatted at 2025-10-24 08:16:03 on dist-test-slave-13l5"
I20251024 08:16:07.370070 19514 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251024 08:16:07.397353 19514 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:07.397855 19514 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:07.398262 19514 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:07.404389 19514 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.126:40485
I20251024 08:16:07.406096 19514 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/master-0/data/info.pb
I20251024 08:16:07.404274 19588 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.126:40485 every 8 connection(s)
I20251024 08:16:07.415894 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 19514
I20251024 08:16:07.417825 19589 sys_catalog.cc:263] Verifying existing consensus state
I20251024 08:16:07.418718 19589 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a: Bootstrap starting.
I20251024 08:16:07.431985 19589 log.cc:826] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a: Log is configured to *not* fsync() on all Append() calls
I20251024 08:16:07.435477 19589 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a: Bootstrap replayed 1/1 log segments. Stats: ops{read=14 overwritten=0 applied=14 ignored=0} inserts{seen=11 ignored=0} mutations{seen=13 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251024 08:16:07.435945 19589 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a: Bootstrap complete.
I20251024 08:16:07.440009 19589 raft_consensus.cc:359] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "32f4918bb4bc4166aa635d60880ad40a" member_type: VOTER last_known_addr { host: "127.18.80.126" port: 40485 } }
I20251024 08:16:07.440735 19589 raft_consensus.cc:740] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 32f4918bb4bc4166aa635d60880ad40a, State: Initialized, Role: FOLLOWER
I20251024 08:16:07.441010 19589 consensus_queue.cc:260] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 14, Last appended: 1.14, Last appended by leader: 14, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "32f4918bb4bc4166aa635d60880ad40a" member_type: VOTER last_known_addr { host: "127.18.80.126" port: 40485 } }
I20251024 08:16:07.441121 19589 raft_consensus.cc:399] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [term 1 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20251024 08:16:07.441190 19589 raft_consensus.cc:493] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [term 1 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20251024 08:16:07.441255 19589 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [term 1 FOLLOWER]: Advancing to term 2
I20251024 08:16:07.442557 19589 raft_consensus.cc:515] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "32f4918bb4bc4166aa635d60880ad40a" member_type: VOTER last_known_addr { host: "127.18.80.126" port: 40485 } }
I20251024 08:16:07.442836 19589 leader_election.cc:304] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 32f4918bb4bc4166aa635d60880ad40a; no voters: 
I20251024 08:16:07.443786 19589 leader_election.cc:290] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [CANDIDATE]: Term 2 election: Requested vote from peers 
I20251024 08:16:07.443984 19593 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [term 2 FOLLOWER]: Leader election won for term 2
I20251024 08:16:07.444072 19589 sys_catalog.cc:565] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [sys.catalog]: configured and running, proceeding with master startup.
I20251024 08:16:07.444197 19593 raft_consensus.cc:697] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [term 2 LEADER]: Becoming Leader. State: Replica: 32f4918bb4bc4166aa635d60880ad40a, State: Running, Role: LEADER
I20251024 08:16:07.444342 19593 consensus_queue.cc:237] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 14, Committed index: 14, Last appended: 1.14, Last appended by leader: 14, Current term: 2, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "32f4918bb4bc4166aa635d60880ad40a" member_type: VOTER last_known_addr { host: "127.18.80.126" port: 40485 } }
I20251024 08:16:07.444860 19593 sys_catalog.cc:455] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 2 leader_uuid: "32f4918bb4bc4166aa635d60880ad40a" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "32f4918bb4bc4166aa635d60880ad40a" member_type: VOTER last_known_addr { host: "127.18.80.126" port: 40485 } } }
I20251024 08:16:07.445053 19593 sys_catalog.cc:458] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [sys.catalog]: This master's current role is: LEADER
I20251024 08:16:07.445329 19593 sys_catalog.cc:455] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [sys.catalog]: SysCatalogTable state changed. Reason: New leader 32f4918bb4bc4166aa635d60880ad40a. Latest consensus state: current_term: 2 leader_uuid: "32f4918bb4bc4166aa635d60880ad40a" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "32f4918bb4bc4166aa635d60880ad40a" member_type: VOTER last_known_addr { host: "127.18.80.126" port: 40485 } } }
I20251024 08:16:07.445765 19604 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20251024 08:16:07.446368 19604 catalog_manager.cc:679] Loaded metadata for table test-workload [id=03d3358e53e6427b8ad498da6c2e9168]
I20251024 08:16:07.446660 19604 tablet_loader.cc:96] loaded metadata for tablet 39d918845bd2430ea45706346c3a38d2 (table test-workload [id=03d3358e53e6427b8ad498da6c2e9168])
I20251024 08:16:07.446734 19604 tablet_loader.cc:96] loaded metadata for tablet 90fe2a8a32ae4fd3a0574ccd6f5ec935 (table test-workload [id=03d3358e53e6427b8ad498da6c2e9168])
I20251024 08:16:07.445533 19593 sys_catalog.cc:458] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a [sys.catalog]: This master's current role is: LEADER
I20251024 08:16:07.447193 19604 tablet_loader.cc:96] loaded metadata for tablet 91d51fadf3f14a5897bf73a1492c5b29 (table test-workload [id=03d3358e53e6427b8ad498da6c2e9168])
I20251024 08:16:07.447266 19604 tablet_loader.cc:96] loaded metadata for tablet 9e9dd8c4736b4a428f5ed6b8e81d2d81 (table test-workload [id=03d3358e53e6427b8ad498da6c2e9168])
I20251024 08:16:07.447302 19604 tablet_loader.cc:96] loaded metadata for tablet aa7ed2496d2143e0ac89caee8c72dbf4 (table test-workload [id=03d3358e53e6427b8ad498da6c2e9168])
I20251024 08:16:07.447338 19604 tablet_loader.cc:96] loaded metadata for tablet f897508fbb474c0eab2ead7fb82d2e0e (table test-workload [id=03d3358e53e6427b8ad498da6c2e9168])
I20251024 08:16:07.447381 19604 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20251024 08:16:07.447495 19604 catalog_manager.cc:1269] Loaded cluster ID: 52bb1c2303b84fa3a260544ca2ff0d4e
I20251024 08:16:07.447525 19604 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20251024 08:16:07.449630 19604 catalog_manager.cc:1514] Loading token signing keys...
I20251024 08:16:07.449928 19604 catalog_manager.cc:6033] T 00000000000000000000000000000000 P 32f4918bb4bc4166aa635d60880ad40a: Loaded TSK: 0
I20251024 08:16:07.450237 19604 catalog_manager.cc:1524] Initializing in-progress tserver states...
I20251024 08:16:07.623286 19551 master_service.cc:438] Got heartbeat from unknown tserver (permanent_uuid: "1ca2674bda43456db1ddae2032441d86" instance_seqno: 1761293763687832) as {username='slave'} at 127.18.80.66:35335; Asking this server to re-register.
I20251024 08:16:07.623780 19111 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:07.623871 19111 heartbeater.cc:507] Master 127.18.80.126:40485 requested a full tablet report, sending...
I20251024 08:16:07.624322 19551 ts_manager.cc:194] Registered new tserver with Master: 1ca2674bda43456db1ddae2032441d86 (127.18.80.66:44791)
W20251024 08:16:07.679082 19000 consensus_peers.cc:597] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20251024 08:16:07.721000 19131 consensus_peers.cc:597] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
W20251024 08:16:07.762527 19000 consensus_peers.cc:597] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
W20251024 08:16:07.786440 19000 consensus_peers.cc:597] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
W20251024 08:16:07.854140 19131 consensus_peers.cc:597] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
W20251024 08:16:07.870462 19000 consensus_peers.cc:597] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
I20251024 08:16:07.943879 19351 consensus_queue.cc:799] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 [LEADER]: Peer 01590ebaf5524b53b67dbcd5ce628ab0 is lagging by at least 6 ops behind the committed index 
I20251024 08:16:07.966542 19351 consensus_queue.cc:799] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 [LEADER]: Peer 01590ebaf5524b53b67dbcd5ce628ab0 is lagging by at least 10 ops behind the committed index 
I20251024 08:16:07.969905 19265 consensus_queue.cc:799] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Peer 01590ebaf5524b53b67dbcd5ce628ab0 is lagging by at least 17 ops behind the committed index 
I20251024 08:16:07.987440 19350 consensus_queue.cc:799] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Peer 01590ebaf5524b53b67dbcd5ce628ab0 is lagging by at least 5 ops behind the committed index 
I20251024 08:16:07.987572 19519 consensus_queue.cc:799] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Peer 01590ebaf5524b53b67dbcd5ce628ab0 is lagging by at least 18 ops behind the committed index 
I20251024 08:16:07.995914 19350 consensus_queue.cc:799] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Peer 01590ebaf5524b53b67dbcd5ce628ab0 is lagging by at least 19 ops behind the committed index 
I20251024 08:16:08.153625 19551 master_service.cc:438] Got heartbeat from unknown tserver (permanent_uuid: "2aa616b606514919b4009cac3bee5b9e" instance_seqno: 1761293764116457) as {username='slave'} at 127.18.80.68:56117; Asking this server to re-register.
I20251024 08:16:08.153999 19493 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:08.154079 19493 heartbeater.cc:507] Master 127.18.80.126:40485 requested a full tablet report, sending...
I20251024 08:16:08.154356 19551 ts_manager.cc:194] Registered new tserver with Master: 2aa616b606514919b4009cac3bee5b9e (127.18.80.68:35609)
W20251024 08:16:08.165258 19000 consensus_peers.cc:597] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
W20251024 08:16:08.204515 19131 consensus_peers.cc:597] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20251024 08:16:08.288102 19000 consensus_peers.cc:597] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
I20251024 08:16:08.290668 19242 heartbeater.cc:344] Connected to a master server at 127.18.80.126:40485
I20251024 08:16:08.291021 19551 master_service.cc:438] Got heartbeat from unknown tserver (permanent_uuid: "124688609f6a4035893a031320ea1d52" instance_seqno: 1761293763799171) as {username='slave'} at 127.18.80.67:34491; Asking this server to re-register.
I20251024 08:16:08.291889 19242 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:08.291970 19242 heartbeater.cc:507] Master 127.18.80.126:40485 requested a full tablet report, sending...
I20251024 08:16:08.292317 19551 ts_manager.cc:194] Registered new tserver with Master: 124688609f6a4035893a031320ea1d52 (127.18.80.67:45549)
W20251024 08:16:08.313751 19000 consensus_peers.cc:597] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20251024 08:16:08.396905 19000 consensus_peers.cc:597] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20251024 08:16:08.414237 19131 consensus_peers.cc:597] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20251024 08:16:08.675422 19131 consensus_peers.cc:597] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20251024 08:16:08.677668 19000 consensus_peers.cc:597] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20251024 08:16:08.806804 19000 consensus_peers.cc:597] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20251024 08:16:08.812342 19000 consensus_peers.cc:597] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20251024 08:16:08.858033 19000 consensus_peers.cc:597] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20251024 08:16:08.978667 19131 consensus_peers.cc:597] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20251024 08:16:09.207365 19000 consensus_peers.cc:597] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20251024 08:16:09.217216 19131 proxy.cc:239] Call had error, refreshing address and retrying: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111) [suppressed 98 similar messages]
W20251024 08:16:09.217859 19131 consensus_peers.cc:597] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20251024 08:16:09.248620 19000 proxy.cc:239] Call had error, refreshing address and retrying: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111) [suppressed 194 similar messages]
W20251024 08:16:09.324543 19000 consensus_peers.cc:597] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20251024 08:16:09.342777 19000 consensus_peers.cc:597] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20251024 08:16:09.392294 19000 consensus_peers.cc:597] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20251024 08:16:09.482082 19131 consensus_peers.cc:597] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20251024 08:16:09.685024 19000 consensus_peers.cc:597] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20251024 08:16:09.712406 19131 consensus_peers.cc:597] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20251024 08:16:09.823711 19000 consensus_peers.cc:597] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20251024 08:16:09.829731 19000 consensus_peers.cc:597] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20251024 08:16:09.937417 19000 consensus_peers.cc:597] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20251024 08:16:09.996687 19131 consensus_peers.cc:597] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20251024 08:16:10.210068 19000 consensus_peers.cc:597] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20251024 08:16:10.236485 19131 consensus_peers.cc:597] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
W20251024 08:16:10.309942 19000 consensus_peers.cc:597] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
W20251024 08:16:10.412487 19000 consensus_peers.cc:597] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
I20251024 08:16:10.423789 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.18.80.65:34683
--local_ip_for_outbound_sockets=127.18.80.65
--tserver_master_addrs=127.18.80.126:40485
--webserver_port=43829
--webserver_interface=127.18.80.65
--builtin_ntp_servers=127.18.80.84:33025
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
W20251024 08:16:10.453665 19000 consensus_peers.cc:597] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
W20251024 08:16:10.478623 19131 consensus_peers.cc:597] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
W20251024 08:16:10.563786 19624 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:10.564006 19624 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:10.564035 19624 flags.cc:425] Enabled unsafe flag: --enable_log_gc=false
W20251024 08:16:10.564060 19624 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:10.566286 19624 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:10.566434 19624 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.18.80.65
I20251024 08:16:10.569666 19624 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:33025
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.18.80.65:34683
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.18.80.65
--webserver_port=43829
--enable_log_gc=false
--tserver_master_addrs=127.18.80.126:40485
--never_fsync=true
--heap_profile_path=/tmp/kudu.19624
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.18.80.65
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:10.570148 19624 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:10.570571 19624 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:10.576399 19632 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:10.576700 19624 server_base.cc:1047] running on GCE node
W20251024 08:16:10.576870 19630 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:10.577262 19629 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:10.577544 19624 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:10.577809 19624 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:10.585568 19624 hybrid_clock.cc:648] HybridClock initialized: now 1761293770585526 us; error 48 us; skew 500 ppm
I20251024 08:16:10.587180 19624 webserver.cc:492] Webserver started at http://127.18.80.65:43829/ using document root <none> and password file <none>
I20251024 08:16:10.587421 19624 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:10.587499 19624 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:10.588984 19624 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:10.589993 19638 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:10.590126 19624 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20251024 08:16:10.590185 19624 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/wal
uuid: "01590ebaf5524b53b67dbcd5ce628ab0"
format_stamp: "Formatted at 2025-10-24 08:16:03 on dist-test-slave-13l5"
I20251024 08:16:10.590484 19624 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251024 08:16:10.640969 19624 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:10.641480 19624 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:10.641652 19624 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:10.641916 19624 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251024 08:16:10.642762 19646 ts_tablet_manager.cc:542] Loading tablet metadata (0/6 complete)
I20251024 08:16:10.648145 19624 ts_tablet_manager.cc:585] Loaded tablet metadata (6 total tablets, 6 live tablets)
I20251024 08:16:10.648308 19624 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.006s	user 0.000s	sys 0.000s
I20251024 08:16:10.648442 19624 ts_tablet_manager.cc:600] Registering tablets (0/6 complete)
I20251024 08:16:10.649477 19646 tablet_bootstrap.cc:492] T 91d51fadf3f14a5897bf73a1492c5b29 P 01590ebaf5524b53b67dbcd5ce628ab0: Bootstrap starting.
I20251024 08:16:10.651057 19624 ts_tablet_manager.cc:616] Registered 6 tablets
I20251024 08:16:10.651140 19624 ts_tablet_manager.cc:595] Time spent register tablets: real 0.003s	user 0.003s	sys 0.000s
I20251024 08:16:10.661319 19624 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.65:34683
I20251024 08:16:10.661897 19624 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1761293763299348-18753-0/minicluster-data/ts-0/data/info.pb
I20251024 08:16:10.663262 19753 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.65:34683 every 8 connection(s)
I20251024 08:16:10.664742 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 19624
I20251024 08:16:10.677982 19646 log.cc:826] T 91d51fadf3f14a5897bf73a1492c5b29 P 01590ebaf5524b53b67dbcd5ce628ab0: Log is configured to *not* fsync() on all Append() calls
I20251024 08:16:10.699347 19754 heartbeater.cc:344] Connected to a master server at 127.18.80.126:40485
I20251024 08:16:10.699991 19754 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:10.700435 19754 heartbeater.cc:507] Master 127.18.80.126:40485 requested a full tablet report, sending...
I20251024 08:16:10.704344 19551 ts_manager.cc:194] Registered new tserver with Master: 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683)
I20251024 08:16:10.705317 19551 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.18.80.65:52859
I20251024 08:16:10.742749 19646 tablet_bootstrap.cc:492] T 91d51fadf3f14a5897bf73a1492c5b29 P 01590ebaf5524b53b67dbcd5ce628ab0: Bootstrap replayed 1/1 log segments. Stats: ops{read=88 overwritten=0 applied=88 ignored=0} inserts{seen=739 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251024 08:16:10.743036 19646 tablet_bootstrap.cc:492] T 91d51fadf3f14a5897bf73a1492c5b29 P 01590ebaf5524b53b67dbcd5ce628ab0: Bootstrap complete.
I20251024 08:16:10.743722 19646 ts_tablet_manager.cc:1403] T 91d51fadf3f14a5897bf73a1492c5b29 P 01590ebaf5524b53b67dbcd5ce628ab0: Time spent bootstrapping tablet: real 0.094s	user 0.012s	sys 0.009s
I20251024 08:16:10.745011 19646 raft_consensus.cc:359] T 91d51fadf3f14a5897bf73a1492c5b29 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:10.745430 19646 raft_consensus.cc:740] T 91d51fadf3f14a5897bf73a1492c5b29 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 01590ebaf5524b53b67dbcd5ce628ab0, State: Initialized, Role: FOLLOWER
I20251024 08:16:10.746358 19646 consensus_queue.cc:260] T 91d51fadf3f14a5897bf73a1492c5b29 P 01590ebaf5524b53b67dbcd5ce628ab0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 88, Last appended: 1.88, Last appended by leader: 88, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:10.746872 19646 ts_tablet_manager.cc:1434] T 91d51fadf3f14a5897bf73a1492c5b29 P 01590ebaf5524b53b67dbcd5ce628ab0: Time spent starting tablet: real 0.003s	user 0.002s	sys 0.000s
I20251024 08:16:10.747218 19754 heartbeater.cc:499] Master 127.18.80.126:40485 was elected leader, sending a full tablet report...
I20251024 08:16:10.750627 19646 tablet_bootstrap.cc:492] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0: Bootstrap starting.
W20251024 08:16:10.767727 19131 consensus_peers.cc:597] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: INITIALIZED. This is attempt 66: this message will repeat every 5th retry.
I20251024 08:16:10.795264 19646 tablet_bootstrap.cc:492] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0: Bootstrap replayed 1/1 log segments. Stats: ops{read=88 overwritten=0 applied=87 ignored=0} inserts{seen=706 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
W20251024 08:16:10.796741 19000 consensus_peers.cc:597] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 61: this message will repeat every 5th retry.
I20251024 08:16:10.802109 19646 tablet_bootstrap.cc:492] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0: Bootstrap complete.
I20251024 08:16:10.803278 19646 ts_tablet_manager.cc:1403] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0: Time spent bootstrapping tablet: real 0.053s	user 0.013s	sys 0.006s
I20251024 08:16:10.803503 19646 raft_consensus.cc:359] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:10.803797 19646 raft_consensus.cc:740] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 01590ebaf5524b53b67dbcd5ce628ab0, State: Initialized, Role: FOLLOWER
I20251024 08:16:10.803903 19646 consensus_queue.cc:260] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 87, Last appended: 1.88, Last appended by leader: 88, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:10.804045 19646 ts_tablet_manager.cc:1434] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0: Time spent starting tablet: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:10.804128 19646 tablet_bootstrap.cc:492] T 39d918845bd2430ea45706346c3a38d2 P 01590ebaf5524b53b67dbcd5ce628ab0: Bootstrap starting.
I20251024 08:16:10.833099 19646 tablet_bootstrap.cc:492] T 39d918845bd2430ea45706346c3a38d2 P 01590ebaf5524b53b67dbcd5ce628ab0: Bootstrap replayed 1/1 log segments. Stats: ops{read=88 overwritten=0 applied=88 ignored=0} inserts{seen=732 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251024 08:16:10.833648 19646 tablet_bootstrap.cc:492] T 39d918845bd2430ea45706346c3a38d2 P 01590ebaf5524b53b67dbcd5ce628ab0: Bootstrap complete.
I20251024 08:16:10.834662 19646 ts_tablet_manager.cc:1403] T 39d918845bd2430ea45706346c3a38d2 P 01590ebaf5524b53b67dbcd5ce628ab0: Time spent bootstrapping tablet: real 0.031s	user 0.009s	sys 0.010s
I20251024 08:16:10.834918 19646 raft_consensus.cc:359] T 39d918845bd2430ea45706346c3a38d2 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:10.835078 19646 raft_consensus.cc:740] T 39d918845bd2430ea45706346c3a38d2 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 01590ebaf5524b53b67dbcd5ce628ab0, State: Initialized, Role: FOLLOWER
I20251024 08:16:10.835201 19646 consensus_queue.cc:260] T 39d918845bd2430ea45706346c3a38d2 P 01590ebaf5524b53b67dbcd5ce628ab0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 88, Last appended: 1.88, Last appended by leader: 88, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:10.837112 19646 ts_tablet_manager.cc:1434] T 39d918845bd2430ea45706346c3a38d2 P 01590ebaf5524b53b67dbcd5ce628ab0: Time spent starting tablet: real 0.002s	user 0.000s	sys 0.000s
I20251024 08:16:10.837668 19646 tablet_bootstrap.cc:492] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 01590ebaf5524b53b67dbcd5ce628ab0: Bootstrap starting.
W20251024 08:16:10.862665 19000 consensus_peers.cc:597] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 66: this message will repeat every 5th retry.
W20251024 08:16:10.920212 19755 tablet_replica_mm_ops.cc:318] Log GC is disabled (check --enable_log_gc)
I20251024 08:16:10.922312 19704 raft_consensus.cc:3060] T aa7ed2496d2143e0ac89caee8c72dbf4 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Advancing to term 2
I20251024 08:16:10.932703 19646 tablet_bootstrap.cc:492] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 01590ebaf5524b53b67dbcd5ce628ab0: Bootstrap replayed 1/1 log segments. Stats: ops{read=86 overwritten=0 applied=86 ignored=0} inserts{seen=742 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251024 08:16:10.933043 19646 tablet_bootstrap.cc:492] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 01590ebaf5524b53b67dbcd5ce628ab0: Bootstrap complete.
I20251024 08:16:10.933925 19646 ts_tablet_manager.cc:1403] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 01590ebaf5524b53b67dbcd5ce628ab0: Time spent bootstrapping tablet: real 0.096s	user 0.012s	sys 0.007s
I20251024 08:16:10.934087 19646 raft_consensus.cc:359] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:10.934144 19646 raft_consensus.cc:740] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 01590ebaf5524b53b67dbcd5ce628ab0, State: Initialized, Role: FOLLOWER
I20251024 08:16:10.934187 19646 consensus_queue.cc:260] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 01590ebaf5524b53b67dbcd5ce628ab0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 86, Last appended: 1.86, Last appended by leader: 86, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:10.934293 19646 ts_tablet_manager.cc:1434] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 01590ebaf5524b53b67dbcd5ce628ab0: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:10.934345 19646 tablet_bootstrap.cc:492] T f897508fbb474c0eab2ead7fb82d2e0e P 01590ebaf5524b53b67dbcd5ce628ab0: Bootstrap starting.
I20251024 08:16:10.958632 19619 consensus_queue.cc:799] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 [LEADER]: Peer 01590ebaf5524b53b67dbcd5ce628ab0 is lagging by at least 1293 ops behind the committed index  [suppressed 29 similar messages]
W20251024 08:16:10.972148 19000 consensus_peers.cc:597] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 66: this message will repeat every 5th retry.
I20251024 08:16:11.029059 19646 tablet_bootstrap.cc:492] T f897508fbb474c0eab2ead7fb82d2e0e P 01590ebaf5524b53b67dbcd5ce628ab0: Bootstrap replayed 1/1 log segments. Stats: ops{read=87 overwritten=0 applied=87 ignored=0} inserts{seen=698 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251024 08:16:11.029512 19646 tablet_bootstrap.cc:492] T f897508fbb474c0eab2ead7fb82d2e0e P 01590ebaf5524b53b67dbcd5ce628ab0: Bootstrap complete.
I20251024 08:16:11.035037 19521 consensus_queue.cc:799] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Peer 01590ebaf5524b53b67dbcd5ce628ab0 is lagging by at least 1317 ops behind the committed index  [suppressed 29 similar messages]
I20251024 08:16:11.035984 19646 ts_tablet_manager.cc:1403] T f897508fbb474c0eab2ead7fb82d2e0e P 01590ebaf5524b53b67dbcd5ce628ab0: Time spent bootstrapping tablet: real 0.102s	user 0.016s	sys 0.005s
I20251024 08:16:11.036163 19646 raft_consensus.cc:359] T f897508fbb474c0eab2ead7fb82d2e0e P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:11.036237 19646 raft_consensus.cc:740] T f897508fbb474c0eab2ead7fb82d2e0e P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 01590ebaf5524b53b67dbcd5ce628ab0, State: Initialized, Role: FOLLOWER
I20251024 08:16:11.036302 19646 consensus_queue.cc:260] T f897508fbb474c0eab2ead7fb82d2e0e P 01590ebaf5524b53b67dbcd5ce628ab0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 87, Last appended: 1.87, Last appended by leader: 87, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:11.036417 19646 ts_tablet_manager.cc:1434] T f897508fbb474c0eab2ead7fb82d2e0e P 01590ebaf5524b53b67dbcd5ce628ab0: Time spent starting tablet: real 0.000s	user 0.001s	sys 0.000s
I20251024 08:16:11.036471 19646 tablet_bootstrap.cc:492] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 01590ebaf5524b53b67dbcd5ce628ab0: Bootstrap starting.
I20251024 08:16:11.197796 19646 tablet_bootstrap.cc:492] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 01590ebaf5524b53b67dbcd5ce628ab0: Bootstrap replayed 1/1 log segments. Stats: ops{read=88 overwritten=0 applied=88 ignored=0} inserts{seen=699 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251024 08:16:11.201164 19646 tablet_bootstrap.cc:492] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 01590ebaf5524b53b67dbcd5ce628ab0: Bootstrap complete.
I20251024 08:16:11.213704 19646 ts_tablet_manager.cc:1403] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 01590ebaf5524b53b67dbcd5ce628ab0: Time spent bootstrapping tablet: real 0.177s	user 0.012s	sys 0.012s
I20251024 08:16:11.214784 19646 raft_consensus.cc:359] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:11.215272 19646 raft_consensus.cc:740] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 01590ebaf5524b53b67dbcd5ce628ab0, State: Initialized, Role: FOLLOWER
I20251024 08:16:11.215425 19646 consensus_queue.cc:260] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 01590ebaf5524b53b67dbcd5ce628ab0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 88, Last appended: 1.88, Last appended by leader: 88, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:11.216538 19646 ts_tablet_manager.cc:1434] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 01590ebaf5524b53b67dbcd5ce628ab0: Time spent starting tablet: real 0.002s	user 0.000s	sys 0.000s
I20251024 08:16:11.218891 19764 raft_consensus.cc:493] T 39d918845bd2430ea45706346c3a38d2 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 1ca2674bda43456db1ddae2032441d86)
I20251024 08:16:11.219009 19764 raft_consensus.cc:515] T 39d918845bd2430ea45706346c3a38d2 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:11.219399 19764 leader_election.cc:290] T 39d918845bd2430ea45706346c3a38d2 P 01590ebaf5524b53b67dbcd5ce628ab0 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 1ca2674bda43456db1ddae2032441d86 (127.18.80.66:44791), 124688609f6a4035893a031320ea1d52 (127.18.80.67:45549)
W20251024 08:16:11.240167 19782 log.cc:927] Time spent T 39d918845bd2430ea45706346c3a38d2 P 01590ebaf5524b53b67dbcd5ce628ab0: Append to log took a long time: real 0.081s	user 0.002s	sys 0.000s
I20251024 08:16:11.246918 19195 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "39d918845bd2430ea45706346c3a38d2" candidate_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" candidate_term: 2 candidate_status { last_received { term: 1 index: 1651 } } ignore_live_leader: false dest_uuid: "124688609f6a4035893a031320ea1d52" is_pre_election: true
I20251024 08:16:11.248287 19064 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "39d918845bd2430ea45706346c3a38d2" candidate_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" candidate_term: 2 candidate_status { last_received { term: 1 index: 1651 } } ignore_live_leader: false dest_uuid: "1ca2674bda43456db1ddae2032441d86" is_pre_election: true
I20251024 08:16:11.248555 19641 leader_election.cc:304] T 39d918845bd2430ea45706346c3a38d2 P 01590ebaf5524b53b67dbcd5ce628ab0 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 01590ebaf5524b53b67dbcd5ce628ab0; no voters: 124688609f6a4035893a031320ea1d52, 1ca2674bda43456db1ddae2032441d86
I20251024 08:16:11.248817 19784 raft_consensus.cc:2749] T 39d918845bd2430ea45706346c3a38d2 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20251024 08:16:11.255000 19780 mvcc.cc:204] Tried to move back new op lower bound from 7214259273573720064 to 7214259259437711360. Current Snapshot: MvccSnapshot[applied={T|T < 7214259270246563840}]
I20251024 08:16:11.298610 19784 raft_consensus.cc:493] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 1ca2674bda43456db1ddae2032441d86)
I20251024 08:16:11.298707 19784 raft_consensus.cc:515] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:11.298849 19784 leader_election.cc:290] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 01590ebaf5524b53b67dbcd5ce628ab0 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 1ca2674bda43456db1ddae2032441d86 (127.18.80.66:44791), 124688609f6a4035893a031320ea1d52 (127.18.80.67:45549)
I20251024 08:16:11.301973 19064 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "9e9dd8c4736b4a428f5ed6b8e81d2d81" candidate_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" candidate_term: 2 candidate_status { last_received { term: 1 index: 1650 } } ignore_live_leader: false dest_uuid: "1ca2674bda43456db1ddae2032441d86" is_pre_election: true
I20251024 08:16:11.302794 19194 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "9e9dd8c4736b4a428f5ed6b8e81d2d81" candidate_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" candidate_term: 2 candidate_status { last_received { term: 1 index: 1650 } } ignore_live_leader: false dest_uuid: "124688609f6a4035893a031320ea1d52" is_pre_election: true
I20251024 08:16:11.303035 19640 leader_election.cc:304] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 01590ebaf5524b53b67dbcd5ce628ab0 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 01590ebaf5524b53b67dbcd5ce628ab0; no voters: 124688609f6a4035893a031320ea1d52, 1ca2674bda43456db1ddae2032441d86
I20251024 08:16:11.303176 19784 raft_consensus.cc:2749] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20251024 08:16:11.380347 19784 raft_consensus.cc:493] T f897508fbb474c0eab2ead7fb82d2e0e P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 1ca2674bda43456db1ddae2032441d86)
I20251024 08:16:11.380450 19784 raft_consensus.cc:515] T f897508fbb474c0eab2ead7fb82d2e0e P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:11.380589 19784 leader_election.cc:290] T f897508fbb474c0eab2ead7fb82d2e0e P 01590ebaf5524b53b67dbcd5ce628ab0 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 1ca2674bda43456db1ddae2032441d86 (127.18.80.66:44791), 124688609f6a4035893a031320ea1d52 (127.18.80.67:45549)
I20251024 08:16:11.381007 19063 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "f897508fbb474c0eab2ead7fb82d2e0e" candidate_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" candidate_term: 2 candidate_status { last_received { term: 1 index: 1647 } } ignore_live_leader: false dest_uuid: "1ca2674bda43456db1ddae2032441d86" is_pre_election: true
I20251024 08:16:11.381253 19196 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "f897508fbb474c0eab2ead7fb82d2e0e" candidate_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" candidate_term: 2 candidate_status { last_received { term: 1 index: 1647 } } ignore_live_leader: false dest_uuid: "124688609f6a4035893a031320ea1d52" is_pre_election: true
I20251024 08:16:11.381433 19640 leader_election.cc:304] T f897508fbb474c0eab2ead7fb82d2e0e P 01590ebaf5524b53b67dbcd5ce628ab0 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 01590ebaf5524b53b67dbcd5ce628ab0; no voters: 124688609f6a4035893a031320ea1d52, 1ca2674bda43456db1ddae2032441d86
I20251024 08:16:11.381570 19784 raft_consensus.cc:2749] T f897508fbb474c0eab2ead7fb82d2e0e P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20251024 08:16:11.560448 19784 raft_consensus.cc:493] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 124688609f6a4035893a031320ea1d52)
I20251024 08:16:11.560547 19784 raft_consensus.cc:515] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } }
I20251024 08:16:11.560763 19784 leader_election.cc:290] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 01590ebaf5524b53b67dbcd5ce628ab0 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 1ca2674bda43456db1ddae2032441d86 (127.18.80.66:44791), 124688609f6a4035893a031320ea1d52 (127.18.80.67:45549)
I20251024 08:16:11.562212 19063 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "90fe2a8a32ae4fd3a0574ccd6f5ec935" candidate_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" candidate_term: 2 candidate_status { last_received { term: 1 index: 1650 } } ignore_live_leader: false dest_uuid: "1ca2674bda43456db1ddae2032441d86" is_pre_election: true
I20251024 08:16:11.562566 19194 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "90fe2a8a32ae4fd3a0574ccd6f5ec935" candidate_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" candidate_term: 2 candidate_status { last_received { term: 1 index: 1650 } } ignore_live_leader: false dest_uuid: "124688609f6a4035893a031320ea1d52" is_pre_election: true
I20251024 08:16:11.562816 19640 leader_election.cc:304] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 01590ebaf5524b53b67dbcd5ce628ab0 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 01590ebaf5524b53b67dbcd5ce628ab0; no voters: 124688609f6a4035893a031320ea1d52, 1ca2674bda43456db1ddae2032441d86
I20251024 08:16:11.562978 19784 raft_consensus.cc:2749] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 01590ebaf5524b53b67dbcd5ce628ab0 [term 1 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20251024 08:16:11.707870 19551 ts_manager.cc:284] Unset tserver state for 01590ebaf5524b53b67dbcd5ce628ab0 from MAINTENANCE_MODE
I20251024 08:16:12.162484 19493 heartbeater.cc:507] Master 127.18.80.126:40485 requested a full tablet report, sending...
I20251024 08:16:12.481945 19111 heartbeater.cc:507] Master 127.18.80.126:40485 requested a full tablet report, sending...
I20251024 08:16:12.567737 19754 heartbeater.cc:507] Master 127.18.80.126:40485 requested a full tablet report, sending...
I20251024 08:16:12.626672 19242 heartbeater.cc:507] Master 127.18.80.126:40485 requested a full tablet report, sending...
I20251024 08:16:14.771448 19551 ts_manager.cc:295] Set tserver state for 01590ebaf5524b53b67dbcd5ce628ab0 to MAINTENANCE_MODE
I20251024 08:16:14.771845 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 19624
W20251024 08:16:14.792173 19131 connection.cc:537] client connection to 127.18.80.65:34683 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20251024 08:16:14.792265 19131 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107) [suppressed 27 similar messages]
W20251024 08:16:14.793031 19000 connection.cc:537] client connection to 127.18.80.65:34683 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20251024 08:16:14.793190 19131 consensus_peers.cc:597] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251024 08:16:14.793216 19000 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107) [suppressed 55 similar messages]
W20251024 08:16:14.793237 19131 consensus_peers.cc:597] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251024 08:16:14.794466 19000 consensus_peers.cc:597] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251024 08:16:14.794548 19000 consensus_peers.cc:597] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251024 08:16:14.794582 19000 consensus_peers.cc:597] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251024 08:16:14.794656 19000 consensus_peers.cc:597] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251024 08:16:15.255591 19000 consensus_peers.cc:597] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251024 08:16:15.261731 19000 consensus_peers.cc:597] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251024 08:16:15.270145 19131 consensus_peers.cc:597] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251024 08:16:15.287909 19000 consensus_peers.cc:597] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251024 08:16:15.302966 19131 consensus_peers.cc:597] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251024 08:16:15.310837 19000 consensus_peers.cc:597] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251024 08:16:15.676383 19000 consensus_peers.cc:597] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251024 08:16:15.752159 19000 consensus_peers.cc:597] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251024 08:16:15.804287 19000 consensus_peers.cc:597] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251024 08:16:15.811218 19131 consensus_peers.cc:597] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251024 08:16:15.815407 19131 consensus_peers.cc:597] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251024 08:16:15.816380 19000 consensus_peers.cc:597] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251024 08:16:16.249799 19000 consensus_peers.cc:597] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251024 08:16:16.278638 19000 consensus_peers.cc:597] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251024 08:16:16.303802 19000 consensus_peers.cc:597] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251024 08:16:16.328133 19131 consensus_peers.cc:597] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251024 08:16:16.339819 19131 consensus_peers.cc:597] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251024 08:16:16.365401 19000 consensus_peers.cc:597] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251024 08:16:16.713892 19000 consensus_peers.cc:597] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20251024 08:16:16.782069 19841 consensus_queue.cc:579] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 [LEADER]: Leader has been unable to successfully communicate with peer 01590ebaf5524b53b67dbcd5ce628ab0 for more than 2 seconds (2.011s)
I20251024 08:16:16.788345 19803 consensus_queue.cc:579] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Leader has been unable to successfully communicate with peer 01590ebaf5524b53b67dbcd5ce628ab0 for more than 2 seconds (2.019s)
I20251024 08:16:16.790050 19806 consensus_queue.cc:579] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Leader has been unable to successfully communicate with peer 01590ebaf5524b53b67dbcd5ce628ab0 for more than 2 seconds (2.021s)
W20251024 08:16:16.795167 19000 consensus_peers.cc:597] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20251024 08:16:16.803157 19803 consensus_queue.cc:579] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Leader has been unable to successfully communicate with peer 01590ebaf5524b53b67dbcd5ce628ab0 for more than 2 seconds (2.034s)
I20251024 08:16:16.804919 19838 consensus_queue.cc:579] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Leader has been unable to successfully communicate with peer 01590ebaf5524b53b67dbcd5ce628ab0 for more than 2 seconds (2.036s)
W20251024 08:16:16.810796 19000 consensus_peers.cc:597] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20251024 08:16:16.842851 19823 consensus_queue.cc:579] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 [LEADER]: Leader has been unable to successfully communicate with peer 01590ebaf5524b53b67dbcd5ce628ab0 for more than 2 seconds (2.072s)
W20251024 08:16:16.845881 19131 consensus_peers.cc:597] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20251024 08:16:16.874085 19131 consensus_peers.cc:597] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20251024 08:16:16.916224 19000 consensus_peers.cc:597] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20251024 08:16:16.921530 19551 ts_manager.cc:284] Unset tserver state for 01590ebaf5524b53b67dbcd5ce628ab0 from MAINTENANCE_MODE
I20251024 08:16:17.167449 19493 heartbeater.cc:507] Master 127.18.80.126:40485 requested a full tablet report, sending...
W20251024 08:16:17.219027 19000 consensus_peers.cc:597] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251024 08:16:17.334024 19000 consensus_peers.cc:597] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251024 08:16:17.335788 19000 consensus_peers.cc:597] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251024 08:16:17.351459 19131 consensus_peers.cc:597] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251024 08:16:17.367442 19000 consensus_peers.cc:597] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251024 08:16:17.379487 19131 consensus_peers.cc:597] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251024 08:16:17.742215 19000 consensus_peers.cc:597] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
I20251024 08:16:17.784334 19810 consensus_queue.cc:799] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Peer 01590ebaf5524b53b67dbcd5ce628ab0 is lagging by at least 2 ops behind the committed index  [suppressed 29 similar messages]
I20251024 08:16:17.786177 19803 consensus_queue.cc:799] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Peer 01590ebaf5524b53b67dbcd5ce628ab0 is lagging by at least 13 ops behind the committed index 
I20251024 08:16:17.786412 19812 consensus_queue.cc:799] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 [LEADER]: Peer 01590ebaf5524b53b67dbcd5ce628ab0 is lagging by at least 17 ops behind the committed index  [suppressed 2 similar messages]
I20251024 08:16:17.810443 19111 heartbeater.cc:507] Master 127.18.80.126:40485 requested a full tablet report, sending...
I20251024 08:16:17.816685 19820 consensus_queue.cc:799] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 [LEADER]: Peer 01590ebaf5524b53b67dbcd5ce628ab0 is lagging by at least 37 ops behind the committed index  [suppressed 27 similar messages]
I20251024 08:16:17.823475 19062 logging.cc:424] LogThrottler TrackedPeer Lag: suppressed but not reported on 27 messages since previous log ~9 seconds ago
I20251024 08:16:17.823644 19065 consensus_queue.cc:237] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 5862, Committed index: 5862, Last appended: 1.5864, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 5865 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } } peers { permanent_uuid: "2aa616b606514919b4009cac3bee5b9e" member_type: NON_VOTER last_known_addr { host: "127.18.80.68" port: 35609 } attrs { promote: true } }
I20251024 08:16:17.823644 19063 consensus_queue.cc:237] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 5860, Committed index: 5860, Last appended: 1.5863, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 5864 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } } peers { permanent_uuid: "2aa616b606514919b4009cac3bee5b9e" member_type: NON_VOTER last_known_addr { host: "127.18.80.68" port: 35609 } attrs { promote: true } }
I20251024 08:16:17.823685 19062 consensus_queue.cc:237] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 5862, Committed index: 5862, Last appended: 1.5863, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 5864 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } } peers { permanent_uuid: "2aa616b606514919b4009cac3bee5b9e" member_type: NON_VOTER last_known_addr { host: "127.18.80.68" port: 35609 } attrs { promote: true } }
I20251024 08:16:17.824445 19063 logging.cc:424] LogThrottler TrackedPeer Lag: suppressed but not reported on 28 messages since previous log ~9 seconds ago
W20251024 08:16:17.824559 19000 consensus_peers.cc:597] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251024 08:16:17.824595 19000 consensus_peers.cc:597] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20251024 08:16:17.824626 19063 consensus_queue.cc:237] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 5863, Committed index: 5863, Last appended: 2.5865, Last appended by leader: 88, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 5866 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } } peers { permanent_uuid: "2aa616b606514919b4009cac3bee5b9e" member_type: NON_VOTER last_known_addr { host: "127.18.80.68" port: 35609 } attrs { promote: true } }
W20251024 08:16:17.825661 19000 consensus_peers.cc:597] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251024 08:16:17.825711 19000 consensus_peers.cc:597] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251024 08:16:17.829022 18998 consensus_peers.cc:597] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 -> Peer 2aa616b606514919b4009cac3bee5b9e (127.18.80.68:35609): Couldn't send request to peer 2aa616b606514919b4009cac3bee5b9e. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: f897508fbb474c0eab2ead7fb82d2e0e. This is attempt 1: this message will repeat every 5th retry.
I20251024 08:16:17.829257 19191 raft_consensus.cc:1275] T 39d918845bd2430ea45706346c3a38d2 P 124688609f6a4035893a031320ea1d52 [term 1 FOLLOWER]: Refusing update from remote peer 1ca2674bda43456db1ddae2032441d86: Log matching property violated. Preceding OpId in replica: term: 1 index: 5862. Preceding OpId from leader: term: 1 index: 5864. (index mismatch)
W20251024 08:16:17.829358 18998 consensus_peers.cc:597] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 -> Peer 2aa616b606514919b4009cac3bee5b9e (127.18.80.68:35609): Couldn't send request to peer 2aa616b606514919b4009cac3bee5b9e. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 9e9dd8c4736b4a428f5ed6b8e81d2d81. This is attempt 1: this message will repeat every 5th retry.
W20251024 08:16:17.829413 18998 consensus_peers.cc:597] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 -> Peer 2aa616b606514919b4009cac3bee5b9e (127.18.80.68:35609): Couldn't send request to peer 2aa616b606514919b4009cac3bee5b9e. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 39d918845bd2430ea45706346c3a38d2. This is attempt 1: this message will repeat every 5th retry.
I20251024 08:16:17.829427 19193 raft_consensus.cc:1275] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 124688609f6a4035893a031320ea1d52 [term 1 FOLLOWER]: Refusing update from remote peer 1ca2674bda43456db1ddae2032441d86: Log matching property violated. Preceding OpId in replica: term: 1 index: 5861. Preceding OpId from leader: term: 1 index: 5864. (index mismatch)
W20251024 08:16:17.829454 18998 consensus_peers.cc:597] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 -> Peer 2aa616b606514919b4009cac3bee5b9e (127.18.80.68:35609): Couldn't send request to peer 2aa616b606514919b4009cac3bee5b9e. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: aa7ed2496d2143e0ac89caee8c72dbf4. This is attempt 1: this message will repeat every 5th retry.
I20251024 08:16:17.829831 19192 raft_consensus.cc:1275] T f897508fbb474c0eab2ead7fb82d2e0e P 124688609f6a4035893a031320ea1d52 [term 1 FOLLOWER]: Refusing update from remote peer 1ca2674bda43456db1ddae2032441d86: Log matching property violated. Preceding OpId in replica: term: 1 index: 5863. Preceding OpId from leader: term: 1 index: 5865. (index mismatch)
I20251024 08:16:17.829924 19803 consensus_queue.cc:1048] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Connected to new peer: Peer: permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 5864, Last known committed idx: 5860, Time since last communication: 0.000s
I20251024 08:16:17.830036 19803 consensus_queue.cc:1048] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Connected to new peer: Peer: permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 5864, Last known committed idx: 5860, Time since last communication: 0.000s
I20251024 08:16:17.830132 19809 consensus_queue.cc:1048] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Connected to new peer: Peer: permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 5865, Last known committed idx: 5862, Time since last communication: 0.000s
I20251024 08:16:17.830220 19192 raft_consensus.cc:1217] T 39d918845bd2430ea45706346c3a38d2 P 124688609f6a4035893a031320ea1d52 [term 1 FOLLOWER]: Deduplicated request from leader. Original: 1.5862->[1.5863-1.5865]   Dedup: 1.5863->[1.5864-1.5865]
I20251024 08:16:17.830444 19191 raft_consensus.cc:1275] T aa7ed2496d2143e0ac89caee8c72dbf4 P 124688609f6a4035893a031320ea1d52 [term 2 FOLLOWER]: Refusing update from remote peer 1ca2674bda43456db1ddae2032441d86: Log matching property violated. Preceding OpId in replica: term: 2 index: 5865. Preceding OpId from leader: term: 2 index: 5866. (index mismatch)
I20251024 08:16:17.830826 19809 consensus_queue.cc:1048] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 [LEADER]: Connected to new peer: Peer: permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 5866, Last known committed idx: 5863, Time since last communication: 0.000s
I20251024 08:16:17.831148 19810 raft_consensus.cc:2955] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86 [term 1 LEADER]: Committing config change with OpId 1.5864: config changed from index -1 to 5864, NON_VOTER 2aa616b606514919b4009cac3bee5b9e (127.18.80.68) added. New config: { opid_index: 5864 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } } peers { permanent_uuid: "2aa616b606514919b4009cac3bee5b9e" member_type: NON_VOTER last_known_addr { host: "127.18.80.68" port: 35609 } attrs { promote: true } } }
I20251024 08:16:17.831292 19192 raft_consensus.cc:2955] T 39d918845bd2430ea45706346c3a38d2 P 124688609f6a4035893a031320ea1d52 [term 1 FOLLOWER]: Committing config change with OpId 1.5864: config changed from index -1 to 5864, NON_VOTER 2aa616b606514919b4009cac3bee5b9e (127.18.80.68) added. New config: { opid_index: 5864 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } } peers { permanent_uuid: "2aa616b606514919b4009cac3bee5b9e" member_type: NON_VOTER last_known_addr { host: "127.18.80.68" port: 35609 } attrs { promote: true } } }
I20251024 08:16:17.831681 19824 raft_consensus.cc:2955] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86 [term 1 LEADER]: Committing config change with OpId 1.5864: config changed from index -1 to 5864, NON_VOTER 2aa616b606514919b4009cac3bee5b9e (127.18.80.68) added. New config: { opid_index: 5864 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } } peers { permanent_uuid: "2aa616b606514919b4009cac3bee5b9e" member_type: NON_VOTER last_known_addr { host: "127.18.80.68" port: 35609 } attrs { promote: true } } }
I20251024 08:16:17.831409 19840 raft_consensus.cc:2955] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86 [term 1 LEADER]: Committing config change with OpId 1.5865: config changed from index -1 to 5865, NON_VOTER 2aa616b606514919b4009cac3bee5b9e (127.18.80.68) added. New config: { opid_index: 5865 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } } peers { permanent_uuid: "2aa616b606514919b4009cac3bee5b9e" member_type: NON_VOTER last_known_addr { host: "127.18.80.68" port: 35609 } attrs { promote: true } } }
I20251024 08:16:17.833151 19539 catalog_manager.cc:5162] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 39d918845bd2430ea45706346c3a38d2 with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20251024 08:16:17.833386 19551 catalog_manager.cc:5649] T 39d918845bd2430ea45706346c3a38d2 P 124688609f6a4035893a031320ea1d52 reported cstate change: config changed from index -1 to 5864, NON_VOTER 2aa616b606514919b4009cac3bee5b9e (127.18.80.68) added. New cstate: current_term: 1 leader_uuid: "1ca2674bda43456db1ddae2032441d86" committed_config { opid_index: 5864 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } } peers { permanent_uuid: "2aa616b606514919b4009cac3bee5b9e" member_type: NON_VOTER last_known_addr { host: "127.18.80.68" port: 35609 } attrs { promote: true } } }
I20251024 08:16:17.833658 19539 catalog_manager.cc:5162] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 9e9dd8c4736b4a428f5ed6b8e81d2d81 with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20251024 08:16:17.834313 19539 catalog_manager.cc:5162] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet f897508fbb474c0eab2ead7fb82d2e0e with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20251024 08:16:17.834620 19191 raft_consensus.cc:2955] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 124688609f6a4035893a031320ea1d52 [term 1 FOLLOWER]: Committing config change with OpId 1.5864: config changed from index -1 to 5864, NON_VOTER 2aa616b606514919b4009cac3bee5b9e (127.18.80.68) added. New config: { opid_index: 5864 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } } peers { permanent_uuid: "2aa616b606514919b4009cac3bee5b9e" member_type: NON_VOTER last_known_addr { host: "127.18.80.68" port: 35609 } attrs { promote: true } } }
I20251024 08:16:17.835132 19194 raft_consensus.cc:2955] T f897508fbb474c0eab2ead7fb82d2e0e P 124688609f6a4035893a031320ea1d52 [term 1 FOLLOWER]: Committing config change with OpId 1.5865: config changed from index -1 to 5865, NON_VOTER 2aa616b606514919b4009cac3bee5b9e (127.18.80.68) added. New config: { opid_index: 5865 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } } peers { permanent_uuid: "2aa616b606514919b4009cac3bee5b9e" member_type: NON_VOTER last_known_addr { host: "127.18.80.68" port: 35609 } attrs { promote: true } } }
I20251024 08:16:17.836285 19242 heartbeater.cc:507] Master 127.18.80.126:40485 requested a full tablet report, sending...
I20251024 08:16:17.836640 19552 catalog_manager.cc:5649] T f897508fbb474c0eab2ead7fb82d2e0e P 124688609f6a4035893a031320ea1d52 reported cstate change: config changed from index -1 to 5865, NON_VOTER 2aa616b606514919b4009cac3bee5b9e (127.18.80.68) added. New cstate: current_term: 1 leader_uuid: "1ca2674bda43456db1ddae2032441d86" committed_config { opid_index: 5865 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } } peers { permanent_uuid: "2aa616b606514919b4009cac3bee5b9e" member_type: NON_VOTER last_known_addr { host: "127.18.80.68" port: 35609 } attrs { promote: true } } }
I20251024 08:16:17.836787 19552 catalog_manager.cc:5649] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 124688609f6a4035893a031320ea1d52 reported cstate change: config changed from index -1 to 5864, NON_VOTER 2aa616b606514919b4009cac3bee5b9e (127.18.80.68) added. New cstate: current_term: 1 leader_uuid: "1ca2674bda43456db1ddae2032441d86" committed_config { opid_index: 5864 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } } peers { permanent_uuid: "2aa616b606514919b4009cac3bee5b9e" member_type: NON_VOTER last_known_addr { host: "127.18.80.68" port: 35609 } attrs { promote: true } } }
I20251024 08:16:17.836907 19809 raft_consensus.cc:2955] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 [term 2 LEADER]: Committing config change with OpId 2.5866: config changed from index -1 to 5866, NON_VOTER 2aa616b606514919b4009cac3bee5b9e (127.18.80.68) added. New config: { opid_index: 5866 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } } peers { permanent_uuid: "2aa616b606514919b4009cac3bee5b9e" member_type: NON_VOTER last_known_addr { host: "127.18.80.68" port: 35609 } attrs { promote: true } } }
I20251024 08:16:17.838245 19191 raft_consensus.cc:2955] T aa7ed2496d2143e0ac89caee8c72dbf4 P 124688609f6a4035893a031320ea1d52 [term 2 FOLLOWER]: Committing config change with OpId 2.5866: config changed from index -1 to 5866, NON_VOTER 2aa616b606514919b4009cac3bee5b9e (127.18.80.68) added. New config: { opid_index: 5866 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } } peers { permanent_uuid: "2aa616b606514919b4009cac3bee5b9e" member_type: NON_VOTER last_known_addr { host: "127.18.80.68" port: 35609 } attrs { promote: true } } }
I20251024 08:16:17.838366 19539 catalog_manager.cc:5162] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet aa7ed2496d2143e0ac89caee8c72dbf4 with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20251024 08:16:17.840870 19551 catalog_manager.cc:5649] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86 reported cstate change: config changed from index -1 to 5866, NON_VOTER 2aa616b606514919b4009cac3bee5b9e (127.18.80.68) added. New cstate: current_term: 2 leader_uuid: "1ca2674bda43456db1ddae2032441d86" committed_config { opid_index: 5866 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "2aa616b606514919b4009cac3bee5b9e" member_type: NON_VOTER last_known_addr { host: "127.18.80.68" port: 35609 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
I20251024 08:16:17.845666 19196 consensus_queue.cc:237] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 5870, Committed index: 5870, Last appended: 1.5870, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 5871 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } } peers { permanent_uuid: "2aa616b606514919b4009cac3bee5b9e" member_type: NON_VOTER last_known_addr { host: "127.18.80.68" port: 35609 } attrs { promote: true } }
I20251024 08:16:17.845769 19191 consensus_queue.cc:237] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 5870, Committed index: 5870, Last appended: 1.5870, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 5871 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } } peers { permanent_uuid: "2aa616b606514919b4009cac3bee5b9e" member_type: NON_VOTER last_known_addr { host: "127.18.80.68" port: 35609 } attrs { promote: true } }
I20251024 08:16:17.846889 19063 raft_consensus.cc:1275] T 91d51fadf3f14a5897bf73a1492c5b29 P 1ca2674bda43456db1ddae2032441d86 [term 1 FOLLOWER]: Refusing update from remote peer 124688609f6a4035893a031320ea1d52: Log matching property violated. Preceding OpId in replica: term: 1 index: 5870. Preceding OpId from leader: term: 1 index: 5871. (index mismatch)
I20251024 08:16:17.846889 19062 raft_consensus.cc:1275] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 1ca2674bda43456db1ddae2032441d86 [term 1 FOLLOWER]: Refusing update from remote peer 124688609f6a4035893a031320ea1d52: Log matching property violated. Preceding OpId in replica: term: 1 index: 5870. Preceding OpId from leader: term: 1 index: 5871. (index mismatch)
I20251024 08:16:17.847779 19841 consensus_queue.cc:1048] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 [LEADER]: Connected to new peer: Peer: permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 5871, Last known committed idx: 5870, Time since last communication: 0.000s
I20251024 08:16:17.847892 19841 consensus_queue.cc:1048] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 [LEADER]: Connected to new peer: Peer: permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 5871, Last known committed idx: 5870, Time since last communication: 0.000s
I20251024 08:16:17.849116 19842 raft_consensus.cc:2955] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 [term 1 LEADER]: Committing config change with OpId 1.5871: config changed from index -1 to 5871, NON_VOTER 2aa616b606514919b4009cac3bee5b9e (127.18.80.68) added. New config: { opid_index: 5871 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } } peers { permanent_uuid: "2aa616b606514919b4009cac3bee5b9e" member_type: NON_VOTER last_known_addr { host: "127.18.80.68" port: 35609 } attrs { promote: true } } }
I20251024 08:16:17.849530 19063 raft_consensus.cc:2955] T 91d51fadf3f14a5897bf73a1492c5b29 P 1ca2674bda43456db1ddae2032441d86 [term 1 FOLLOWER]: Committing config change with OpId 1.5871: config changed from index -1 to 5871, NON_VOTER 2aa616b606514919b4009cac3bee5b9e (127.18.80.68) added. New config: { opid_index: 5871 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } } peers { permanent_uuid: "2aa616b606514919b4009cac3bee5b9e" member_type: NON_VOTER last_known_addr { host: "127.18.80.68" port: 35609 } attrs { promote: true } } }
I20251024 08:16:17.849531 19062 raft_consensus.cc:2955] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 1ca2674bda43456db1ddae2032441d86 [term 1 FOLLOWER]: Committing config change with OpId 1.5871: config changed from index -1 to 5871, NON_VOTER 2aa616b606514919b4009cac3bee5b9e (127.18.80.68) added. New config: { opid_index: 5871 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } } peers { permanent_uuid: "2aa616b606514919b4009cac3bee5b9e" member_type: NON_VOTER last_known_addr { host: "127.18.80.68" port: 35609 } attrs { promote: true } } }
I20251024 08:16:17.850174 19538 catalog_manager.cc:5162] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 90fe2a8a32ae4fd3a0574ccd6f5ec935 with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20251024 08:16:17.849161 19841 raft_consensus.cc:2955] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 [term 1 LEADER]: Committing config change with OpId 1.5871: config changed from index -1 to 5871, NON_VOTER 2aa616b606514919b4009cac3bee5b9e (127.18.80.68) added. New config: { opid_index: 5871 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } } peers { permanent_uuid: "2aa616b606514919b4009cac3bee5b9e" member_type: NON_VOTER last_known_addr { host: "127.18.80.68" port: 35609 } attrs { promote: true } } }
I20251024 08:16:17.850530 19552 catalog_manager.cc:5649] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 reported cstate change: config changed from index -1 to 5871, NON_VOTER 2aa616b606514919b4009cac3bee5b9e (127.18.80.68) added. New cstate: current_term: 1 leader_uuid: "124688609f6a4035893a031320ea1d52" committed_config { opid_index: 5871 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "2aa616b606514919b4009cac3bee5b9e" member_type: NON_VOTER last_known_addr { host: "127.18.80.68" port: 35609 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
W20251024 08:16:17.850806 19131 consensus_peers.cc:597] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251024 08:16:17.850871 19131 consensus_peers.cc:597] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 -> Peer 01590ebaf5524b53b67dbcd5ce628ab0 (127.18.80.65:34683): Couldn't send request to peer 01590ebaf5524b53b67dbcd5ce628ab0. Status: Network error: Client connection negotiation failed: client connection to 127.18.80.65:34683: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20251024 08:16:17.851277 19551 catalog_manager.cc:5649] T 91d51fadf3f14a5897bf73a1492c5b29 P 1ca2674bda43456db1ddae2032441d86 reported cstate change: config changed from index -1 to 5871, NON_VOTER 2aa616b606514919b4009cac3bee5b9e (127.18.80.68) added. New cstate: current_term: 1 leader_uuid: "124688609f6a4035893a031320ea1d52" committed_config { opid_index: 5871 OBSOLETE_local: false peers { permanent_uuid: "01590ebaf5524b53b67dbcd5ce628ab0" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 34683 } } peers { permanent_uuid: "1ca2674bda43456db1ddae2032441d86" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 44791 } } peers { permanent_uuid: "124688609f6a4035893a031320ea1d52" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 45549 } } peers { permanent_uuid: "2aa616b606514919b4009cac3bee5b9e" member_type: NON_VOTER last_known_addr { host: "127.18.80.68" port: 35609 } attrs { promote: true } } }
I20251024 08:16:17.851644 19538 catalog_manager.cc:5162] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 91d51fadf3f14a5897bf73a1492c5b29 with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
W20251024 08:16:17.853291 19129 consensus_peers.cc:597] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52 -> Peer 2aa616b606514919b4009cac3bee5b9e (127.18.80.68:35609): Couldn't send request to peer 2aa616b606514919b4009cac3bee5b9e. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 91d51fadf3f14a5897bf73a1492c5b29. This is attempt 1: this message will repeat every 5th retry.
W20251024 08:16:17.853353 19129 consensus_peers.cc:597] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52 -> Peer 2aa616b606514919b4009cac3bee5b9e (127.18.80.68:35609): Couldn't send request to peer 2aa616b606514919b4009cac3bee5b9e. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 90fe2a8a32ae4fd3a0574ccd6f5ec935. This is attempt 1: this message will repeat every 5th retry.
I20251024 08:16:17.908044 19860 ts_tablet_manager.cc:933] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 2aa616b606514919b4009cac3bee5b9e: Initiating tablet copy from peer 1ca2674bda43456db1ddae2032441d86 (127.18.80.66:44791)
I20251024 08:16:17.908473 19860 tablet_copy_client.cc:323] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 2aa616b606514919b4009cac3bee5b9e: tablet copy: Beginning tablet copy session from remote peer at address 127.18.80.66:44791
I20251024 08:16:17.917374 19863 ts_tablet_manager.cc:933] T 39d918845bd2430ea45706346c3a38d2 P 2aa616b606514919b4009cac3bee5b9e: Initiating tablet copy from peer 1ca2674bda43456db1ddae2032441d86 (127.18.80.66:44791)
I20251024 08:16:17.917707 19863 tablet_copy_client.cc:323] T 39d918845bd2430ea45706346c3a38d2 P 2aa616b606514919b4009cac3bee5b9e: tablet copy: Beginning tablet copy session from remote peer at address 127.18.80.66:44791
I20251024 08:16:17.921393 19085 tablet_copy_service.cc:140] P 1ca2674bda43456db1ddae2032441d86: Received BeginTabletCopySession request for tablet 9e9dd8c4736b4a428f5ed6b8e81d2d81 from peer 2aa616b606514919b4009cac3bee5b9e ({username='slave'} at 127.18.80.68:53713)
I20251024 08:16:17.921485 19085 tablet_copy_service.cc:161] P 1ca2674bda43456db1ddae2032441d86: Beginning new tablet copy session on tablet 9e9dd8c4736b4a428f5ed6b8e81d2d81 from peer 2aa616b606514919b4009cac3bee5b9e at {username='slave'} at 127.18.80.68:53713: session id = 2aa616b606514919b4009cac3bee5b9e-9e9dd8c4736b4a428f5ed6b8e81d2d81
I20251024 08:16:17.922207 19085 tablet_copy_source_session.cc:215] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 1ca2674bda43456db1ddae2032441d86: Tablet Copy: opened 0 blocks and 1 log segments
I20251024 08:16:17.922458 19084 tablet_copy_service.cc:140] P 1ca2674bda43456db1ddae2032441d86: Received BeginTabletCopySession request for tablet 39d918845bd2430ea45706346c3a38d2 from peer 2aa616b606514919b4009cac3bee5b9e ({username='slave'} at 127.18.80.68:53713)
I20251024 08:16:17.922514 19084 tablet_copy_service.cc:161] P 1ca2674bda43456db1ddae2032441d86: Beginning new tablet copy session on tablet 39d918845bd2430ea45706346c3a38d2 from peer 2aa616b606514919b4009cac3bee5b9e at {username='slave'} at 127.18.80.68:53713: session id = 2aa616b606514919b4009cac3bee5b9e-39d918845bd2430ea45706346c3a38d2
I20251024 08:16:17.922883 19084 tablet_copy_source_session.cc:215] T 39d918845bd2430ea45706346c3a38d2 P 1ca2674bda43456db1ddae2032441d86: Tablet Copy: opened 0 blocks and 1 log segments
I20251024 08:16:17.923184 19860 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 9e9dd8c4736b4a428f5ed6b8e81d2d81. 1 dirs total, 0 dirs full, 0 dirs failed
I20251024 08:16:17.923525 19863 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 39d918845bd2430ea45706346c3a38d2. 1 dirs total, 0 dirs full, 0 dirs failed
I20251024 08:16:17.925884 19860 tablet_copy_client.cc:806] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 2aa616b606514919b4009cac3bee5b9e: tablet copy: Starting download of 0 data blocks...
I20251024 08:16:17.926030 19860 tablet_copy_client.cc:670] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 2aa616b606514919b4009cac3bee5b9e: tablet copy: Starting download of 1 WAL segments...
I20251024 08:16:17.926401 19863 tablet_copy_client.cc:806] T 39d918845bd2430ea45706346c3a38d2 P 2aa616b606514919b4009cac3bee5b9e: tablet copy: Starting download of 0 data blocks...
I20251024 08:16:17.926517 19863 tablet_copy_client.cc:670] T 39d918845bd2430ea45706346c3a38d2 P 2aa616b606514919b4009cac3bee5b9e: tablet copy: Starting download of 1 WAL segments...
I20251024 08:16:17.936702 19865 ts_tablet_manager.cc:933] T 91d51fadf3f14a5897bf73a1492c5b29 P 2aa616b606514919b4009cac3bee5b9e: Initiating tablet copy from peer 124688609f6a4035893a031320ea1d52 (127.18.80.67:45549)
I20251024 08:16:17.958880 19865 tablet_copy_client.cc:323] T 91d51fadf3f14a5897bf73a1492c5b29 P 2aa616b606514919b4009cac3bee5b9e: tablet copy: Beginning tablet copy session from remote peer at address 127.18.80.67:45549
I20251024 08:16:17.960242 19869 ts_tablet_manager.cc:933] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 2aa616b606514919b4009cac3bee5b9e: Initiating tablet copy from peer 124688609f6a4035893a031320ea1d52 (127.18.80.67:45549)
I20251024 08:16:17.960492 19869 tablet_copy_client.cc:323] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 2aa616b606514919b4009cac3bee5b9e: tablet copy: Beginning tablet copy session from remote peer at address 127.18.80.67:45549
I20251024 08:16:17.966066 19215 tablet_copy_service.cc:140] P 124688609f6a4035893a031320ea1d52: Received BeginTabletCopySession request for tablet 90fe2a8a32ae4fd3a0574ccd6f5ec935 from peer 2aa616b606514919b4009cac3bee5b9e ({username='slave'} at 127.18.80.68:47683)
I20251024 08:16:17.966066 19216 tablet_copy_service.cc:140] P 124688609f6a4035893a031320ea1d52: Received BeginTabletCopySession request for tablet 91d51fadf3f14a5897bf73a1492c5b29 from peer 2aa616b606514919b4009cac3bee5b9e ({username='slave'} at 127.18.80.68:47683)
I20251024 08:16:17.966156 19215 tablet_copy_service.cc:161] P 124688609f6a4035893a031320ea1d52: Beginning new tablet copy session on tablet 90fe2a8a32ae4fd3a0574ccd6f5ec935 from peer 2aa616b606514919b4009cac3bee5b9e at {username='slave'} at 127.18.80.68:47683: session id = 2aa616b606514919b4009cac3bee5b9e-90fe2a8a32ae4fd3a0574ccd6f5ec935
I20251024 08:16:17.966365 19216 tablet_copy_service.cc:161] P 124688609f6a4035893a031320ea1d52: Beginning new tablet copy session on tablet 91d51fadf3f14a5897bf73a1492c5b29 from peer 2aa616b606514919b4009cac3bee5b9e at {username='slave'} at 127.18.80.68:47683: session id = 2aa616b606514919b4009cac3bee5b9e-91d51fadf3f14a5897bf73a1492c5b29
I20251024 08:16:17.966827 19216 tablet_copy_source_session.cc:215] T 91d51fadf3f14a5897bf73a1492c5b29 P 124688609f6a4035893a031320ea1d52: Tablet Copy: opened 0 blocks and 1 log segments
I20251024 08:16:17.967356 19865 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 91d51fadf3f14a5897bf73a1492c5b29. 1 dirs total, 0 dirs full, 0 dirs failed
I20251024 08:16:17.966827 19215 tablet_copy_source_session.cc:215] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 124688609f6a4035893a031320ea1d52: Tablet Copy: opened 0 blocks and 1 log segments
I20251024 08:16:17.970016 19865 tablet_copy_client.cc:806] T 91d51fadf3f14a5897bf73a1492c5b29 P 2aa616b606514919b4009cac3bee5b9e: tablet copy: Starting download of 0 data blocks...
I20251024 08:16:17.970275 19865 tablet_copy_client.cc:670] T 91d51fadf3f14a5897bf73a1492c5b29 P 2aa616b606514919b4009cac3bee5b9e: tablet copy: Starting download of 1 WAL segments...
I20251024 08:16:17.970578 19869 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 90fe2a8a32ae4fd3a0574ccd6f5ec935. 1 dirs total, 0 dirs full, 0 dirs failed
I20251024 08:16:17.972069 19869 tablet_copy_client.cc:806] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 2aa616b606514919b4009cac3bee5b9e: tablet copy: Starting download of 0 data blocks...
I20251024 08:16:17.972252 19869 tablet_copy_client.cc:670] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 2aa616b606514919b4009cac3bee5b9e: tablet copy: Starting download of 1 WAL segments...
I20251024 08:16:17.942649 19867 ts_tablet_manager.cc:933] T f897508fbb474c0eab2ead7fb82d2e0e P 2aa616b606514919b4009cac3bee5b9e: Initiating tablet copy from peer 1ca2674bda43456db1ddae2032441d86 (127.18.80.66:44791)
I20251024 08:16:17.972857 19867 tablet_copy_client.cc:323] T f897508fbb474c0eab2ead7fb82d2e0e P 2aa616b606514919b4009cac3bee5b9e: tablet copy: Beginning tablet copy session from remote peer at address 127.18.80.66:44791
I20251024 08:16:17.951143 19863 tablet_copy_client.cc:538] T 39d918845bd2430ea45706346c3a38d2 P 2aa616b606514919b4009cac3bee5b9e: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20251024 08:16:17.977941 19863 tablet_bootstrap.cc:492] T 39d918845bd2430ea45706346c3a38d2 P 2aa616b606514919b4009cac3bee5b9e: Bootstrap starting.
I20251024 08:16:17.940301 19866 ts_tablet_manager.cc:933] T aa7ed2496d2143e0ac89caee8c72dbf4 P 2aa616b606514919b4009cac3bee5b9e: Initiating tablet copy from peer 1ca2674bda43456db1ddae2032441d86 (127.18.80.66:44791)
I20251024 08:16:17.959041 19860 tablet_copy_client.cc:538] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 2aa616b606514919b4009cac3bee5b9e: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20251024 08:16:17.983359 19860 tablet_bootstrap.cc:492] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 2aa616b606514919b4009cac3bee5b9e: Bootstrap starting.
I20251024 08:16:17.985021 19085 tablet_copy_service.cc:140] P 1ca2674bda43456db1ddae2032441d86: Received BeginTabletCopySession request for tablet f897508fbb474c0eab2ead7fb82d2e0e from peer 2aa616b606514919b4009cac3bee5b9e ({username='slave'} at 127.18.80.68:53713)
I20251024 08:16:17.985080 19085 tablet_copy_service.cc:161] P 1ca2674bda43456db1ddae2032441d86: Beginning new tablet copy session on tablet f897508fbb474c0eab2ead7fb82d2e0e from peer 2aa616b606514919b4009cac3bee5b9e at {username='slave'} at 127.18.80.68:53713: session id = 2aa616b606514919b4009cac3bee5b9e-f897508fbb474c0eab2ead7fb82d2e0e
I20251024 08:16:17.985613 19085 tablet_copy_source_session.cc:215] T f897508fbb474c0eab2ead7fb82d2e0e P 1ca2674bda43456db1ddae2032441d86: Tablet Copy: opened 0 blocks and 1 log segments
I20251024 08:16:17.989257 19866 tablet_copy_client.cc:323] T aa7ed2496d2143e0ac89caee8c72dbf4 P 2aa616b606514919b4009cac3bee5b9e: tablet copy: Beginning tablet copy session from remote peer at address 127.18.80.66:44791
I20251024 08:16:17.989516 19085 tablet_copy_service.cc:140] P 1ca2674bda43456db1ddae2032441d86: Received BeginTabletCopySession request for tablet aa7ed2496d2143e0ac89caee8c72dbf4 from peer 2aa616b606514919b4009cac3bee5b9e ({username='slave'} at 127.18.80.68:53713)
I20251024 08:16:17.989575 19085 tablet_copy_service.cc:161] P 1ca2674bda43456db1ddae2032441d86: Beginning new tablet copy session on tablet aa7ed2496d2143e0ac89caee8c72dbf4 from peer 2aa616b606514919b4009cac3bee5b9e at {username='slave'} at 127.18.80.68:53713: session id = 2aa616b606514919b4009cac3bee5b9e-aa7ed2496d2143e0ac89caee8c72dbf4
I20251024 08:16:17.990088 19085 tablet_copy_source_session.cc:215] T aa7ed2496d2143e0ac89caee8c72dbf4 P 1ca2674bda43456db1ddae2032441d86: Tablet Copy: opened 0 blocks and 1 log segments
I20251024 08:16:17.993044 19866 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet aa7ed2496d2143e0ac89caee8c72dbf4. 1 dirs total, 0 dirs full, 0 dirs failed
I20251024 08:16:17.994449 19867 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet f897508fbb474c0eab2ead7fb82d2e0e. 1 dirs total, 0 dirs full, 0 dirs failed
I20251024 08:16:17.994874 19866 tablet_copy_client.cc:806] T aa7ed2496d2143e0ac89caee8c72dbf4 P 2aa616b606514919b4009cac3bee5b9e: tablet copy: Starting download of 0 data blocks...
I20251024 08:16:17.995050 19866 tablet_copy_client.cc:670] T aa7ed2496d2143e0ac89caee8c72dbf4 P 2aa616b606514919b4009cac3bee5b9e: tablet copy: Starting download of 1 WAL segments...
I20251024 08:16:17.995612 19867 tablet_copy_client.cc:806] T f897508fbb474c0eab2ead7fb82d2e0e P 2aa616b606514919b4009cac3bee5b9e: tablet copy: Starting download of 0 data blocks...
I20251024 08:16:17.995718 19867 tablet_copy_client.cc:670] T f897508fbb474c0eab2ead7fb82d2e0e P 2aa616b606514919b4009cac3bee5b9e: tablet copy: Starting download of 1 WAL segments...
I20251024 08:16:17.998179 19865 tablet_copy_client.cc:538] T 91d51fadf3f14a5897bf73a1492c5b29 P 2aa616b606514919b4009cac3bee5b9e: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20251024 08:16:17.998988 19865 tablet_bootstrap.cc:492] T 91d51fadf3f14a5897bf73a1492c5b29 P 2aa616b606514919b4009cac3bee5b9e: Bootstrap starting.
I20251024 08:16:18.010537 19869 tablet_copy_client.cc:538] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 2aa616b606514919b4009cac3bee5b9e: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20251024 08:16:18.011812 19869 tablet_bootstrap.cc:492] T 90fe2a8a32ae4fd3a0574ccd6f5ec935 P 2aa616b606514919b4009cac3bee5b9e: Bootstrap starting.
I20251024 08:16:18.017850 19866 tablet_copy_client.cc:538] T aa7ed2496d2143e0ac89caee8c72dbf4 P 2aa616b606514919b4009cac3bee5b9e: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20251024 08:16:18.018743 19866 tablet_bootstrap.cc:492] T aa7ed2496d2143e0ac89caee8c72dbf4 P 2aa616b606514919b4009cac3bee5b9e: Bootstrap starting.
I20251024 08:16:18.022550 19867 tablet_copy_client.cc:538] T f897508fbb474c0eab2ead7fb82d2e0e P 2aa616b606514919b4009cac3bee5b9e: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20251024 08:16:18.023427 19867 tablet_bootstrap.cc:492] T f897508fbb474c0eab2ead7fb82d2e0e P 2aa616b606514919b4009cac3bee5b9e: Bootstrap starting.
I20251024 08:16:18.130188 19860 log.cc:826] T 9e9dd8c4736b4a428f5ed6b8e81d2d81 P 2aa616b606514919b4009cac3bee5b9e: Log is configured to *not* fsync() on all Append() calls
I20251024 08:16:18.210440 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 18983
I20251024 08:16:18.230998 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 19114
I20251024 08:16:18.257038 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 19302
I20251024 08:16:18.262259 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 19514
2025-10-24T08:16:18Z chronyd exiting
[       OK ] MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate (14999 ms)
[----------] 1 test from MaintenanceModeRF3ITest (14999 ms total)

[----------] 1 test from RollingRestartArgs/RollingRestartITest
[ RUN      ] RollingRestartArgs/RollingRestartITest.TestWorkloads/4
2025-10-24T08:16:18Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2025-10-24T08:16:18Z Disabled control of system clock
I20251024 08:16:18.310520 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.18.80.126:45025
--webserver_interface=127.18.80.126
--webserver_port=0
--builtin_ntp_servers=127.18.80.84:38683
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.18.80.126:45025
--location_mapping_cmd=/tmp/dist-test-taskNzynA4/build/release/bin/testdata/assign-location.py --state_store=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/location-assignment.state --map /L0:4
--master_client_location_assignment_enabled=false with env {}
W20251024 08:16:18.386338 19888 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:18.386520 19888 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:18.386538 19888 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:18.387796 19888 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20251024 08:16:18.387836 19888 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:18.387848 19888 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20251024 08:16:18.387861 19888 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20251024 08:16:18.389312 19888 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:38683
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/master-0/wal
--location_mapping_cmd=/tmp/dist-test-taskNzynA4/build/release/bin/testdata/assign-location.py --state_store=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/location-assignment.state --map /L0:4
--ipki_ca_key_size=768
--master_addresses=127.18.80.126:45025
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.18.80.126:45025
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.18.80.126
--webserver_port=0
--never_fsync=true
--heap_profile_path=/tmp/kudu.19888
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:18.389509 19888 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:18.389712 19888 file_cache.cc:492] Constructed file cache file cache with capacity 419430
I20251024 08:16:18.392338 19888 server_base.cc:1047] running on GCE node
W20251024 08:16:18.392314 19896 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:18.392323 19894 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:18.392472 19893 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:18.392817 19888 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:18.393100 19888 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:18.394225 19888 hybrid_clock.cc:648] HybridClock initialized: now 1761293778394211 us; error 35 us; skew 500 ppm
I20251024 08:16:18.395308 19888 webserver.cc:492] Webserver started at http://127.18.80.126:35863/ using document root <none> and password file <none>
I20251024 08:16:18.395496 19888 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:18.395534 19888 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:18.395639 19888 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20251024 08:16:18.396492 19888 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/master-0/data/instance:
uuid: "08e81818520c41e7807c6891a15a43d4"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:18.396790 19888 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/master-0/wal/instance:
uuid: "08e81818520c41e7807c6891a15a43d4"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:18.398002 19888 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.000s	sys 0.001s
I20251024 08:16:18.398679 19902 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:18.398833 19888 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.000s	sys 0.001s
I20251024 08:16:18.398901 19888 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/master-0/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/master-0/wal
uuid: "08e81818520c41e7807c6891a15a43d4"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:18.398955 19888 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251024 08:16:18.416319 19888 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:18.416646 19888 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:18.416785 19888 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:18.420872 19888 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.126:45025
I20251024 08:16:18.420931 19954 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.126:45025 every 8 connection(s)
I20251024 08:16:18.421259 19888 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/master-0/data/info.pb
I20251024 08:16:18.421838 19955 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20251024 08:16:18.424108 19955 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 08e81818520c41e7807c6891a15a43d4: Bootstrap starting.
I20251024 08:16:18.424427 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 19888
I20251024 08:16:18.424510 18753 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/master-0/wal/instance
I20251024 08:16:18.424747 19955 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 08e81818520c41e7807c6891a15a43d4: Neither blocks nor log segments found. Creating new log.
I20251024 08:16:18.425071 19955 log.cc:826] T 00000000000000000000000000000000 P 08e81818520c41e7807c6891a15a43d4: Log is configured to *not* fsync() on all Append() calls
I20251024 08:16:18.425784 19955 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 08e81818520c41e7807c6891a15a43d4: No bootstrap required, opened a new log
I20251024 08:16:18.426942 19955 raft_consensus.cc:359] T 00000000000000000000000000000000 P 08e81818520c41e7807c6891a15a43d4 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "08e81818520c41e7807c6891a15a43d4" member_type: VOTER last_known_addr { host: "127.18.80.126" port: 45025 } }
I20251024 08:16:18.427059 19955 raft_consensus.cc:385] T 00000000000000000000000000000000 P 08e81818520c41e7807c6891a15a43d4 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251024 08:16:18.427089 19955 raft_consensus.cc:740] T 00000000000000000000000000000000 P 08e81818520c41e7807c6891a15a43d4 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 08e81818520c41e7807c6891a15a43d4, State: Initialized, Role: FOLLOWER
I20251024 08:16:18.427181 19955 consensus_queue.cc:260] T 00000000000000000000000000000000 P 08e81818520c41e7807c6891a15a43d4 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "08e81818520c41e7807c6891a15a43d4" member_type: VOTER last_known_addr { host: "127.18.80.126" port: 45025 } }
I20251024 08:16:18.427248 19955 raft_consensus.cc:399] T 00000000000000000000000000000000 P 08e81818520c41e7807c6891a15a43d4 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20251024 08:16:18.427286 19955 raft_consensus.cc:493] T 00000000000000000000000000000000 P 08e81818520c41e7807c6891a15a43d4 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20251024 08:16:18.427331 19955 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 08e81818520c41e7807c6891a15a43d4 [term 0 FOLLOWER]: Advancing to term 1
I20251024 08:16:18.427839 19955 raft_consensus.cc:515] T 00000000000000000000000000000000 P 08e81818520c41e7807c6891a15a43d4 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "08e81818520c41e7807c6891a15a43d4" member_type: VOTER last_known_addr { host: "127.18.80.126" port: 45025 } }
I20251024 08:16:18.427950 19955 leader_election.cc:304] T 00000000000000000000000000000000 P 08e81818520c41e7807c6891a15a43d4 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 08e81818520c41e7807c6891a15a43d4; no voters: 
I20251024 08:16:18.428139 19955 leader_election.cc:290] T 00000000000000000000000000000000 P 08e81818520c41e7807c6891a15a43d4 [CANDIDATE]: Term 1 election: Requested vote from peers 
I20251024 08:16:18.428213 19960 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 08e81818520c41e7807c6891a15a43d4 [term 1 FOLLOWER]: Leader election won for term 1
I20251024 08:16:18.428357 19955 sys_catalog.cc:565] T 00000000000000000000000000000000 P 08e81818520c41e7807c6891a15a43d4 [sys.catalog]: configured and running, proceeding with master startup.
I20251024 08:16:18.428525 19960 raft_consensus.cc:697] T 00000000000000000000000000000000 P 08e81818520c41e7807c6891a15a43d4 [term 1 LEADER]: Becoming Leader. State: Replica: 08e81818520c41e7807c6891a15a43d4, State: Running, Role: LEADER
I20251024 08:16:18.428668 19960 consensus_queue.cc:237] T 00000000000000000000000000000000 P 08e81818520c41e7807c6891a15a43d4 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "08e81818520c41e7807c6891a15a43d4" member_type: VOTER last_known_addr { host: "127.18.80.126" port: 45025 } }
I20251024 08:16:18.429201 19960 sys_catalog.cc:455] T 00000000000000000000000000000000 P 08e81818520c41e7807c6891a15a43d4 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 08e81818520c41e7807c6891a15a43d4. Latest consensus state: current_term: 1 leader_uuid: "08e81818520c41e7807c6891a15a43d4" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "08e81818520c41e7807c6891a15a43d4" member_type: VOTER last_known_addr { host: "127.18.80.126" port: 45025 } } }
I20251024 08:16:18.429299 19960 sys_catalog.cc:458] T 00000000000000000000000000000000 P 08e81818520c41e7807c6891a15a43d4 [sys.catalog]: This master's current role is: LEADER
I20251024 08:16:18.429548 19961 sys_catalog.cc:455] T 00000000000000000000000000000000 P 08e81818520c41e7807c6891a15a43d4 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "08e81818520c41e7807c6891a15a43d4" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "08e81818520c41e7807c6891a15a43d4" member_type: VOTER last_known_addr { host: "127.18.80.126" port: 45025 } } }
I20251024 08:16:18.429653 19961 sys_catalog.cc:458] T 00000000000000000000000000000000 P 08e81818520c41e7807c6891a15a43d4 [sys.catalog]: This master's current role is: LEADER
I20251024 08:16:18.429888 19972 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20251024 08:16:18.430380 19972 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20251024 08:16:18.432111 19972 catalog_manager.cc:1357] Generated new cluster ID: bcc33bc06fed469e9f71e6ba14d219e5
I20251024 08:16:18.432155 19972 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20251024 08:16:18.441509 19972 catalog_manager.cc:1380] Generated new certificate authority record
I20251024 08:16:18.442317 19972 catalog_manager.cc:1514] Loading token signing keys...
I20251024 08:16:18.449456 19972 catalog_manager.cc:6022] T 00000000000000000000000000000000 P 08e81818520c41e7807c6891a15a43d4: Generated new TSK 0
I20251024 08:16:18.449688 19972 catalog_manager.cc:1524] Initializing in-progress tserver states...
I20251024 08:16:18.455897 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.18.80.65:0
--local_ip_for_outbound_sockets=127.18.80.65
--webserver_interface=127.18.80.65
--webserver_port=0
--tserver_master_addrs=127.18.80.126:45025
--builtin_ntp_servers=127.18.80.84:38683
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251024 08:16:18.534510 19979 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:18.534689 19979 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:18.534718 19979 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:18.536038 19979 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:18.536098 19979 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.18.80.65
I20251024 08:16:18.537555 19979 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:38683
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.18.80.65:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.18.80.65
--webserver_port=0
--tserver_master_addrs=127.18.80.126:45025
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.19979
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.18.80.65
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:18.537792 19979 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:18.538004 19979 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:18.540452 19984 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:18.540486 19985 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:18.540571 19987 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:18.540634 19979 server_base.cc:1047] running on GCE node
I20251024 08:16:18.540978 19979 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:18.541198 19979 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:18.542330 19979 hybrid_clock.cc:648] HybridClock initialized: now 1761293778542318 us; error 31 us; skew 500 ppm
I20251024 08:16:18.543393 19979 webserver.cc:492] Webserver started at http://127.18.80.65:40139/ using document root <none> and password file <none>
I20251024 08:16:18.543612 19979 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:18.543658 19979 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:18.543766 19979 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20251024 08:16:18.544623 19979 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data/instance:
uuid: "97d4708eb2b64571b34044be6da3d298"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:18.544940 19979 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal/instance:
uuid: "97d4708eb2b64571b34044be6da3d298"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:18.546118 19979 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:18.546869 19993 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:18.547035 19979 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:18.547101 19979 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal
uuid: "97d4708eb2b64571b34044be6da3d298"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:18.547158 19979 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251024 08:16:18.567118 19979 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:18.567399 19979 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:18.567513 19979 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:18.567705 19979 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251024 08:16:18.567999 19979 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251024 08:16:18.568032 19979 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:18.568064 19979 ts_tablet_manager.cc:616] Registered 0 tablets
I20251024 08:16:18.568092 19979 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:18.573820 19979 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.65:35069
I20251024 08:16:18.573889 20106 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.65:35069 every 8 connection(s)
I20251024 08:16:18.574158 19979 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data/info.pb
I20251024 08:16:18.578526 20107 heartbeater.cc:344] Connected to a master server at 127.18.80.126:45025
I20251024 08:16:18.578614 20107 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:18.578805 20107 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:18.580418 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 19979
I20251024 08:16:18.580533 18753 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal/instance
I20251024 08:16:18.582638 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.18.80.66:0
--local_ip_for_outbound_sockets=127.18.80.66
--webserver_interface=127.18.80.66
--webserver_port=0
--tserver_master_addrs=127.18.80.126:45025
--builtin_ntp_servers=127.18.80.84:38683
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251024 08:16:18.612144 19919 ts_manager.cc:194] Registered new tserver with Master: 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069)
I20251024 08:16:18.613277 19919 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.18.80.65:59727
W20251024 08:16:18.660789 20111 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:18.661002 20111 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:18.661026 20111 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:18.662284 20111 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:18.662328 20111 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.18.80.66
I20251024 08:16:18.663597 20111 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:38683
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.18.80.66:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.18.80.66
--webserver_port=0
--tserver_master_addrs=127.18.80.126:45025
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.20111
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.18.80.66
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:18.663767 20111 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:18.663950 20111 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:18.666467 20116 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:18.666592 20117 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:18.666525 20111 server_base.cc:1047] running on GCE node
W20251024 08:16:18.666515 20119 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:18.666797 20111 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:18.667001 20111 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:18.668125 20111 hybrid_clock.cc:648] HybridClock initialized: now 1761293778668118 us; error 27 us; skew 500 ppm
I20251024 08:16:18.669250 20111 webserver.cc:492] Webserver started at http://127.18.80.66:35129/ using document root <none> and password file <none>
I20251024 08:16:18.669450 20111 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:18.669493 20111 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:18.669595 20111 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20251024 08:16:18.670478 20111 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data/instance:
uuid: "773ff64ed1b249db9be71c247d7cbf43"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:18.670811 20111 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal/instance:
uuid: "773ff64ed1b249db9be71c247d7cbf43"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:18.671958 20111 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.003s	sys 0.000s
I20251024 08:16:18.672607 20125 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:18.672760 20111 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20251024 08:16:18.672829 20111 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal
uuid: "773ff64ed1b249db9be71c247d7cbf43"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:18.672911 20111 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251024 08:16:18.684516 20111 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:18.684772 20111 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:18.684921 20111 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:18.685141 20111 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251024 08:16:18.685456 20111 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251024 08:16:18.685489 20111 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:18.685523 20111 ts_tablet_manager.cc:616] Registered 0 tablets
I20251024 08:16:18.685544 20111 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:18.690896 20111 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.66:39115
I20251024 08:16:18.690958 20238 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.66:39115 every 8 connection(s)
I20251024 08:16:18.691251 20111 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data/info.pb
I20251024 08:16:18.695375 20239 heartbeater.cc:344] Connected to a master server at 127.18.80.126:45025
I20251024 08:16:18.695456 20239 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:18.695653 20239 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:18.697624 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 20111
I20251024 08:16:18.697718 18753 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal/instance
I20251024 08:16:18.699088 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.18.80.67:0
--local_ip_for_outbound_sockets=127.18.80.67
--webserver_interface=127.18.80.67
--webserver_port=0
--tserver_master_addrs=127.18.80.126:45025
--builtin_ntp_servers=127.18.80.84:38683
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251024 08:16:18.727335 19919 ts_manager.cc:194] Registered new tserver with Master: 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66:39115)
I20251024 08:16:18.727835 19919 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.18.80.66:48799
W20251024 08:16:18.778698 20243 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:18.778869 20243 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:18.778887 20243 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:18.780314 20243 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:18.780371 20243 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.18.80.67
I20251024 08:16:18.781747 20243 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:38683
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.18.80.67:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.18.80.67
--webserver_port=0
--tserver_master_addrs=127.18.80.126:45025
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.20243
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.18.80.67
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:18.781986 20243 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:18.782203 20243 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:18.784664 20248 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:18.784678 20249 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:18.784730 20243 server_base.cc:1047] running on GCE node
W20251024 08:16:18.784751 20251 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:18.785051 20243 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:18.785238 20243 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:18.786362 20243 hybrid_clock.cc:648] HybridClock initialized: now 1761293778786350 us; error 27 us; skew 500 ppm
I20251024 08:16:18.787444 20243 webserver.cc:492] Webserver started at http://127.18.80.67:43985/ using document root <none> and password file <none>
I20251024 08:16:18.787672 20243 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:18.787720 20243 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:18.787837 20243 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20251024 08:16:18.788679 20243 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data/instance:
uuid: "e9ac8f0e11a34e5fb1c19a793f211a56"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:18.789027 20243 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal/instance:
uuid: "e9ac8f0e11a34e5fb1c19a793f211a56"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:18.790208 20243 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:18.790907 20257 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:18.791064 20243 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20251024 08:16:18.791155 20243 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal
uuid: "e9ac8f0e11a34e5fb1c19a793f211a56"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:18.791208 20243 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251024 08:16:18.819320 20243 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:18.819595 20243 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:18.819725 20243 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:18.819972 20243 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251024 08:16:18.820284 20243 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251024 08:16:18.820317 20243 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:18.820356 20243 ts_tablet_manager.cc:616] Registered 0 tablets
I20251024 08:16:18.820389 20243 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:18.825600 20243 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.67:40981
I20251024 08:16:18.825672 20370 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.67:40981 every 8 connection(s)
I20251024 08:16:18.825943 20243 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data/info.pb
I20251024 08:16:18.830440 20371 heartbeater.cc:344] Connected to a master server at 127.18.80.126:45025
I20251024 08:16:18.830562 20371 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:18.830765 20371 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:18.835712 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 20243
I20251024 08:16:18.835791 18753 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal/instance
I20251024 08:16:18.837177 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.18.80.68:0
--local_ip_for_outbound_sockets=127.18.80.68
--webserver_interface=127.18.80.68
--webserver_port=0
--tserver_master_addrs=127.18.80.126:45025
--builtin_ntp_servers=127.18.80.84:38683
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251024 08:16:18.862425 19919 ts_manager.cc:194] Registered new tserver with Master: e9ac8f0e11a34e5fb1c19a793f211a56 (127.18.80.67:40981)
I20251024 08:16:18.862891 19919 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.18.80.67:43277
W20251024 08:16:18.916633 20375 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:18.916823 20375 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:18.916844 20375 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:18.918294 20375 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:18.918355 20375 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.18.80.68
I20251024 08:16:18.919865 20375 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:38683
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.18.80.68:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.18.80.68
--webserver_port=0
--tserver_master_addrs=127.18.80.126:45025
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.20375
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.18.80.68
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:18.920092 20375 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:18.920327 20375 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:18.922969 20380 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:18.922971 20383 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:18.923038 20381 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:18.923202 20375 server_base.cc:1047] running on GCE node
I20251024 08:16:18.923369 20375 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:18.923566 20375 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:18.924695 20375 hybrid_clock.cc:648] HybridClock initialized: now 1761293778924685 us; error 30 us; skew 500 ppm
I20251024 08:16:18.925855 20375 webserver.cc:492] Webserver started at http://127.18.80.68:33347/ using document root <none> and password file <none>
I20251024 08:16:18.926057 20375 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:18.926102 20375 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:18.926215 20375 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20251024 08:16:18.927033 20375 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data/instance:
uuid: "36b0ebc9a5694a778497ec8d94aba993"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:18.927316 20375 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal/instance:
uuid: "36b0ebc9a5694a778497ec8d94aba993"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:18.928438 20375 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.000s	sys 0.002s
I20251024 08:16:18.929112 20389 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:18.929325 20375 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.000s	sys 0.001s
I20251024 08:16:18.929395 20375 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal
uuid: "36b0ebc9a5694a778497ec8d94aba993"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:18.929492 20375 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251024 08:16:18.949766 20375 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:18.950035 20375 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:18.950129 20375 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:18.950325 20375 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251024 08:16:18.950664 20375 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251024 08:16:18.950696 20375 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:18.950723 20375 ts_tablet_manager.cc:616] Registered 0 tablets
I20251024 08:16:18.950750 20375 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:18.956029 20375 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.68:34051
I20251024 08:16:18.956094 20502 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.68:34051 every 8 connection(s)
I20251024 08:16:18.956405 20375 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data/info.pb
I20251024 08:16:18.960611 20503 heartbeater.cc:344] Connected to a master server at 127.18.80.126:45025
I20251024 08:16:18.960698 20503 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:18.960932 20503 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:18.963920 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 20375
I20251024 08:16:18.963997 18753 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal/instance
I20251024 08:16:18.992170 19919 ts_manager.cc:194] Registered new tserver with Master: 36b0ebc9a5694a778497ec8d94aba993 (127.18.80.68:34051)
I20251024 08:16:18.992623 19919 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.18.80.68:32893
I20251024 08:16:18.998045 18753 external_mini_cluster.cc:949] 4 TS(s) registered with all masters
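
A hedged illustration (assuming the public KuduClient::ListTabletServers API and that the caller owns the returned entries) of how the same "all tablet servers registered" check could be made from a client against the master at 127.18.80.126:45025:

  #include <iostream>
  #include <vector>

  #include "kudu/client/client.h"
  #include "kudu/util/status.h"

  kudu::Status CountRegisteredTabletServersSketch(
      const kudu::client::sp::shared_ptr<kudu::client::KuduClient>& client) {
    std::vector<kudu::client::KuduTabletServer*> servers;
    RETURN_NOT_OK(client->ListTabletServers(&servers));
    std::cout << servers.size() << " tablet server(s) registered" << std::endl;
    for (const auto* ts : servers) {
      std::cout << "  " << ts->uuid() << " " << ts->hostname() << std::endl;
      delete ts;  // assuming ListTabletServers() hands ownership to the caller
    }
    return kudu::Status::OK();
  }
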
I20251024 08:16:19.003430 18753 test_util.cc:276] Using random seed: 705357860
I20251024 08:16:19.009133 19919 catalog_manager.cc:2257] Servicing CreateTable request from {username='slave'} at 127.0.0.1:57766:
name: "test-workload"
schema {
  columns {
    name: "key"
    type: INT32
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "int_val"
    type: INT32
    is_key: false
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "string_val"
    type: STRING
    is_key: false
    is_nullable: true
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
  range_schema {
    columns {
      name: "key"
    }
  }
}
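The request above is a protobuf dump of the table the workload creates. As a rough sketch only (using the documented Kudu C++ client API, not the test's own helper), an equivalent client-side creation of the same schema, unbounded range partition on "key", and replication factor 3 would look like this; the master address and table name are taken from this log.

  #include <memory>
  #include <string>
  #include <vector>

  #include "kudu/client/client.h"
  #include "kudu/util/status.h"

  using kudu::client::KuduClient;
  using kudu::client::KuduClientBuilder;
  using kudu::client::KuduColumnSchema;
  using kudu::client::KuduSchema;
  using kudu::client::KuduSchemaBuilder;
  using kudu::client::KuduTableCreator;

  kudu::Status CreateTestWorkloadTableSketch() {
    // Connect to the master the tablet servers registered with above.
    kudu::client::sp::shared_ptr<KuduClient> client;
    RETURN_NOT_OK(KuduClientBuilder()
        .add_master_server_addr("127.18.80.126:45025")
        .Build(&client));

    // Columns mirror the CreateTable request dumped above.
    KuduSchemaBuilder b;
    b.AddColumn("key")->Type(KuduColumnSchema::INT32)->NotNull()->PrimaryKey();
    b.AddColumn("int_val")->Type(KuduColumnSchema::INT32)->NotNull();
    b.AddColumn("string_val")->Type(KuduColumnSchema::STRING)->Nullable();
    KuduSchema schema;
    RETURN_NOT_OK(b.Build(&schema));

    // Unbounded range partition on "key", three replicas, as in the request.
    std::unique_ptr<KuduTableCreator> creator(client->NewTableCreator());
    return creator->table_name("test-workload")
        .schema(&schema)
        .set_range_partition_columns({"key"})
        .num_replicas(3)
        .Create();
  }
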
I20251024 08:16:19.015213 20305 tablet_service.cc:1505] Processing CreateTablet for tablet 29f8dacedea249e9883a604ac785b905 (DEFAULT_TABLE table=test-workload [id=840ba50926fd4b15a1a3f2eb1359b0e3]), partition=RANGE (key) PARTITION UNBOUNDED
I20251024 08:16:19.015535 20305 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 29f8dacedea249e9883a604ac785b905. 1 dirs total, 0 dirs full, 0 dirs failed
I20251024 08:16:19.016182 20041 tablet_service.cc:1505] Processing CreateTablet for tablet 29f8dacedea249e9883a604ac785b905 (DEFAULT_TABLE table=test-workload [id=840ba50926fd4b15a1a3f2eb1359b0e3]), partition=RANGE (key) PARTITION UNBOUNDED
I20251024 08:16:19.016458 20041 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 29f8dacedea249e9883a604ac785b905. 1 dirs total, 0 dirs full, 0 dirs failed
I20251024 08:16:19.016448 20173 tablet_service.cc:1505] Processing CreateTablet for tablet 29f8dacedea249e9883a604ac785b905 (DEFAULT_TABLE table=test-workload [id=840ba50926fd4b15a1a3f2eb1359b0e3]), partition=RANGE (key) PARTITION UNBOUNDED
I20251024 08:16:19.016734 20173 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 29f8dacedea249e9883a604ac785b905. 1 dirs total, 0 dirs full, 0 dirs failed
I20251024 08:16:19.018184 20526 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Bootstrap starting.
I20251024 08:16:19.018577 20527 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Bootstrap starting.
I20251024 08:16:19.018882 20526 tablet_bootstrap.cc:654] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Neither blocks nor log segments found. Creating new log.
I20251024 08:16:19.019114 20527 tablet_bootstrap.cc:654] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Neither blocks nor log segments found. Creating new log.
I20251024 08:16:19.019114 20526 log.cc:826] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Log is configured to *not* fsync() on all Append() calls
I20251024 08:16:19.019348 20527 log.cc:826] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Log is configured to *not* fsync() on all Append() calls
I20251024 08:16:19.019433 20528 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Bootstrap starting.
I20251024 08:16:19.019903 20526 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: No bootstrap required, opened a new log
I20251024 08:16:19.019974 20526 ts_tablet_manager.cc:1403] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Time spent bootstrapping tablet: real 0.002s	user 0.001s	sys 0.000s
I20251024 08:16:19.020038 20528 tablet_bootstrap.cc:654] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Neither blocks nor log segments found. Creating new log.
I20251024 08:16:19.020068 20527 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: No bootstrap required, opened a new log
I20251024 08:16:19.020121 20527 ts_tablet_manager.cc:1403] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Time spent bootstrapping tablet: real 0.002s	user 0.001s	sys 0.000s
I20251024 08:16:19.020299 20528 log.cc:826] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Log is configured to *not* fsync() on all Append() calls
I20251024 08:16:19.020853 20528 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: No bootstrap required, opened a new log
I20251024 08:16:19.020963 20528 ts_tablet_manager.cc:1403] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Time spent bootstrapping tablet: real 0.002s	user 0.001s	sys 0.000s
I20251024 08:16:19.021551 20527 raft_consensus.cc:359] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:19.021526 20526 raft_consensus.cc:359] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:19.021680 20526 raft_consensus.cc:385] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251024 08:16:19.021680 20527 raft_consensus.cc:385] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251024 08:16:19.021708 20526 raft_consensus.cc:740] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: e9ac8f0e11a34e5fb1c19a793f211a56, State: Initialized, Role: FOLLOWER
I20251024 08:16:19.021708 20527 raft_consensus.cc:740] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 773ff64ed1b249db9be71c247d7cbf43, State: Initialized, Role: FOLLOWER
I20251024 08:16:19.021801 20526 consensus_queue.cc:260] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:19.021802 20527 consensus_queue.cc:260] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:19.022017 20527 ts_tablet_manager.cc:1434] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
I20251024 08:16:19.022017 20526 ts_tablet_manager.cc:1434] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
I20251024 08:16:19.022050 20239 heartbeater.cc:499] Master 127.18.80.126:45025 was elected leader, sending a full tablet report...
I20251024 08:16:19.022114 20371 heartbeater.cc:499] Master 127.18.80.126:45025 was elected leader, sending a full tablet report...
I20251024 08:16:19.022454 20528 raft_consensus.cc:359] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:19.022575 20528 raft_consensus.cc:385] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251024 08:16:19.022601 20528 raft_consensus.cc:740] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 97d4708eb2b64571b34044be6da3d298, State: Initialized, Role: FOLLOWER
I20251024 08:16:19.022681 20528 consensus_queue.cc:260] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:19.022900 20528 ts_tablet_manager.cc:1434] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Time spent starting tablet: real 0.002s	user 0.000s	sys 0.002s
I20251024 08:16:19.022994 20107 heartbeater.cc:499] Master 127.18.80.126:45025 was elected leader, sending a full tablet report...
I20251024 08:16:19.051904 20534 raft_consensus.cc:493] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251024 08:16:19.052026 20534 raft_consensus.cc:515] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:19.052340 20534 leader_election.cc:290] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66:39115), e9ac8f0e11a34e5fb1c19a793f211a56 (127.18.80.67:40981)
I20251024 08:16:19.055356 20532 raft_consensus.cc:493] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251024 08:16:19.055430 20325 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "97d4708eb2b64571b34044be6da3d298" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" is_pre_election: true
I20251024 08:16:19.055460 20532 raft_consensus.cc:515] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:19.055606 20325 raft_consensus.cc:2468] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 97d4708eb2b64571b34044be6da3d298 in term 0.
I20251024 08:16:19.055794 20532 leader_election.cc:290] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66:39115), 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069)
I20251024 08:16:19.055789 20193 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "97d4708eb2b64571b34044be6da3d298" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "773ff64ed1b249db9be71c247d7cbf43" is_pre_election: true
I20251024 08:16:19.055814 19997 leader_election.cc:304] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 97d4708eb2b64571b34044be6da3d298, e9ac8f0e11a34e5fb1c19a793f211a56; no voters: 
I20251024 08:16:19.055907 20193 raft_consensus.cc:2468] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 97d4708eb2b64571b34044be6da3d298 in term 0.
I20251024 08:16:19.055934 20534 raft_consensus.cc:2804] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20251024 08:16:19.056002 20534 raft_consensus.cc:493] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251024 08:16:19.056030 20534 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 0 FOLLOWER]: Advancing to term 1
I20251024 08:16:19.056747 20534 raft_consensus.cc:515] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:19.056910 20534 leader_election.cc:290] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [CANDIDATE]: Term 1 election: Requested vote from peers 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66:39115), e9ac8f0e11a34e5fb1c19a793f211a56 (127.18.80.67:40981)
I20251024 08:16:19.057041 20325 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "97d4708eb2b64571b34044be6da3d298" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56"
I20251024 08:16:19.057106 20325 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 0 FOLLOWER]: Advancing to term 1
I20251024 08:16:19.057214 20193 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "97d4708eb2b64571b34044be6da3d298" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "773ff64ed1b249db9be71c247d7cbf43"
I20251024 08:16:19.057287 20193 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 0 FOLLOWER]: Advancing to term 1
I20251024 08:16:19.057844 20325 raft_consensus.cc:2468] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 97d4708eb2b64571b34044be6da3d298 in term 1.
I20251024 08:16:19.057996 19997 leader_election.cc:304] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 97d4708eb2b64571b34044be6da3d298, e9ac8f0e11a34e5fb1c19a793f211a56; no voters: 
I20251024 08:16:19.058035 20193 raft_consensus.cc:2468] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 97d4708eb2b64571b34044be6da3d298 in term 1.
I20251024 08:16:19.058069 20534 raft_consensus.cc:2804] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 1 FOLLOWER]: Leader election won for term 1
I20251024 08:16:19.058200 20534 raft_consensus.cc:697] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 1 LEADER]: Becoming Leader. State: Replica: 97d4708eb2b64571b34044be6da3d298, State: Running, Role: LEADER
I20251024 08:16:19.058301 20534 consensus_queue.cc:237] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:19.058825 20193 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "773ff64ed1b249db9be71c247d7cbf43" is_pre_election: true
I20251024 08:16:19.058897 20193 raft_consensus.cc:2393] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 1 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate e9ac8f0e11a34e5fb1c19a793f211a56 in current term 1: Already voted for candidate 97d4708eb2b64571b34044be6da3d298 in this term.
I20251024 08:16:19.059108 19919 catalog_manager.cc:5649] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 reported cstate change: term changed from 0 to 1, leader changed from <none> to 97d4708eb2b64571b34044be6da3d298 (127.18.80.65). New cstate: current_term: 1 leader_uuid: "97d4708eb2b64571b34044be6da3d298" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } health_report { overall_health: HEALTHY } } }
I20251024 08:16:19.059814 20061 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "97d4708eb2b64571b34044be6da3d298" is_pre_election: true
I20251024 08:16:19.060086 20258 leader_election.cc:304] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: e9ac8f0e11a34e5fb1c19a793f211a56; no voters: 773ff64ed1b249db9be71c247d7cbf43, 97d4708eb2b64571b34044be6da3d298
I20251024 08:16:19.060214 20532 raft_consensus.cc:2749] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 1 FOLLOWER]: Leader pre-election lost for term 1. Reason: could not achieve majority
I20251024 08:16:19.071273 18753 maintenance_mode-itest.cc:745] Restarting batch of 4 tservers: 773ff64ed1b249db9be71c247d7cbf43,97d4708eb2b64571b34044be6da3d298,36b0ebc9a5694a778497ec8d94aba993,e9ac8f0e11a34e5fb1c19a793f211a56
W20251024 08:16:19.074882 20108 tablet.cc:2378] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Can't schedule compaction. Clean time has not been advanced past its initial value.
W20251024 08:16:19.076469 20372 tablet.cc:2378] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20251024 08:16:19.113996 20193 raft_consensus.cc:1275] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 1 FOLLOWER]: Refusing update from remote peer 97d4708eb2b64571b34044be6da3d298: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251024 08:16:19.114281 20325 raft_consensus.cc:1275] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 1 FOLLOWER]: Refusing update from remote peer 97d4708eb2b64571b34044be6da3d298: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251024 08:16:19.114557 20534 consensus_queue.cc:1048] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [LEADER]: Connected to new peer: Peer: permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251024 08:16:19.116539 20534 consensus_queue.cc:1048] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [LEADER]: Connected to new peer: Peer: permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251024 08:16:19.122875 20554 mvcc.cc:204] Tried to move back new op lower bound from 7214259319249965056 to 7214259319023525888. Current Snapshot: MvccSnapshot[applied={T|T < 7214259319248179200}]
I20251024 08:16:19.124946 20557 mvcc.cc:204] Tried to move back new op lower bound from 7214259319249965056 to 7214259319023525888. Current Snapshot: MvccSnapshot[applied={T|T < 7214259319248179200}]
I20251024 08:16:19.125612 20555 mvcc.cc:204] Tried to move back new op lower bound from 7214259319249965056 to 7214259319023525888. Current Snapshot: MvccSnapshot[applied={T|T < 7214259319248179200}]
I20251024 08:16:19.214603 19910 ts_manager.cc:295] Set tserver state for 773ff64ed1b249db9be71c247d7cbf43 to MAINTENANCE_MODE
I20251024 08:16:19.250701 19910 ts_manager.cc:295] Set tserver state for 36b0ebc9a5694a778497ec8d94aba993 to MAINTENANCE_MODE
I20251024 08:16:19.265683 19910 ts_manager.cc:295] Set tserver state for e9ac8f0e11a34e5fb1c19a793f211a56 to MAINTENANCE_MODE
I20251024 08:16:19.319020 19910 ts_manager.cc:295] Set tserver state for 97d4708eb2b64571b34044be6da3d298 to MAINTENANCE_MODE
I20251024 08:16:19.414083 20173 tablet_service.cc:1460] Tablet server 773ff64ed1b249db9be71c247d7cbf43 set to quiescing
I20251024 08:16:19.414149 20173 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:19.565893 20041 tablet_service.cc:1460] Tablet server 97d4708eb2b64571b34044be6da3d298 set to quiescing
I20251024 08:16:19.565958 20041 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20251024 08:16:19.567771 20534 raft_consensus.cc:993] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: : Instructing follower e9ac8f0e11a34e5fb1c19a793f211a56 to start an election
I20251024 08:16:19.567852 20534 raft_consensus.cc:1081] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 1 LEADER]: Signalling peer e9ac8f0e11a34e5fb1c19a793f211a56 to start an election
I20251024 08:16:19.568719 20325 tablet_service.cc:2038] Received Run Leader Election RPC: tablet_id: "29f8dacedea249e9883a604ac785b905"
dest_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56"
 from {username='slave'} at 127.18.80.65:39847
I20251024 08:16:19.568825 20325 raft_consensus.cc:493] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 1 FOLLOWER]: Starting forced leader election (received explicit request)
I20251024 08:16:19.568852 20325 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 1 FOLLOWER]: Advancing to term 2
I20251024 08:16:19.570986 20325 raft_consensus.cc:515] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 2 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:19.571197 20324 raft_consensus.cc:1240] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 2 FOLLOWER]: Rejecting Update request from peer 97d4708eb2b64571b34044be6da3d298 for earlier term 1. Current term is 2. Ops: []
I20251024 08:16:19.571671 20325 leader_election.cc:290] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [CANDIDATE]: Term 2 election: Requested vote from peers 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66:39115), 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069)
I20251024 08:16:19.571816 20192 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" candidate_term: 2 candidate_status { last_received { term: 1 index: 298 } } ignore_live_leader: true dest_uuid: "773ff64ed1b249db9be71c247d7cbf43"
I20251024 08:16:19.572000 20570 consensus_queue.cc:1059] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 }, Status: INVALID_TERM, Last received: 1.298, Next index: 299, Last known committed idx: 296, Time since last communication: 0.000s
I20251024 08:16:19.572158 20611 raft_consensus.cc:3055] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 1 LEADER]: Stepping down as leader of term 1
I20251024 08:16:19.572192 20611 raft_consensus.cc:740] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 1 LEADER]: Becoming Follower/Learner. State: Replica: 97d4708eb2b64571b34044be6da3d298, State: Running, Role: LEADER
I20251024 08:16:19.572237 20611 consensus_queue.cc:260] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 298, Committed index: 298, Last appended: 1.301, Last appended by leader: 301, Current term: 1, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:19.572381 20611 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 1 FOLLOWER]: Advancing to term 2
I20251024 08:16:19.573042 20061 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" candidate_term: 2 candidate_status { last_received { term: 1 index: 298 } } ignore_live_leader: true dest_uuid: "97d4708eb2b64571b34044be6da3d298"
I20251024 08:16:19.573124 20061 raft_consensus.cc:2410] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 2 FOLLOWER]: Leader election vote request: Denying vote to candidate e9ac8f0e11a34e5fb1c19a793f211a56 for term 2 because replica has last-logged OpId of term: 1 index: 301, which is greater than that of the candidate, which has last-logged OpId of term: 1 index: 298.
I20251024 08:16:19.573364 20258 leader_election.cc:304] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [CANDIDATE]: Term 2 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: e9ac8f0e11a34e5fb1c19a793f211a56; no voters: 773ff64ed1b249db9be71c247d7cbf43, 97d4708eb2b64571b34044be6da3d298
I20251024 08:16:19.573486 20532 raft_consensus.cc:2749] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 2 FOLLOWER]: Leader election lost for term 2. Reason: could not achieve majority
I20251024 08:16:19.587754 20305 tablet_service.cc:1460] Tablet server e9ac8f0e11a34e5fb1c19a793f211a56 set to quiescing
I20251024 08:16:19.587822 20305 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:19.626665 20437 tablet_service.cc:1460] Tablet server 36b0ebc9a5694a778497ec8d94aba993 set to quiescing
I20251024 08:16:19.626727 20437 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251024 08:16:19.827953 20611 raft_consensus.cc:670] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: failed to trigger leader election: Illegal state: leader elections are disabled
W20251024 08:16:19.871901 20653 raft_consensus.cc:670] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: failed to trigger leader election: Illegal state: leader elections are disabled
W20251024 08:16:19.963971 20532 raft_consensus.cc:670] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: failed to trigger leader election: Illegal state: leader elections are disabled
I20251024 08:16:19.993580 20503 heartbeater.cc:499] Master 127.18.80.126:45025 was elected leader, sending a full tablet report...
I20251024 08:16:20.707063 20041 tablet_service.cc:1460] Tablet server 97d4708eb2b64571b34044be6da3d298 set to quiescing
I20251024 08:16:20.707124 20041 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:20.762887 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 20111
I20251024 08:16:20.768201 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.18.80.66:39115
--local_ip_for_outbound_sockets=127.18.80.66
--tserver_master_addrs=127.18.80.126:45025
--webserver_port=35129
--webserver_interface=127.18.80.66
--builtin_ntp_servers=127.18.80.84:38683
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251024 08:16:20.843736 20665 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:20.843928 20665 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:20.843950 20665 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:20.845324 20665 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:20.845372 20665 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.18.80.66
I20251024 08:16:20.846830 20665 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:38683
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.18.80.66:39115
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.18.80.66
--webserver_port=35129
--tserver_master_addrs=127.18.80.126:45025
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.20665
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.18.80.66
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:20.847020 20665 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:20.847198 20665 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:20.849674 20673 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:20.849687 20670 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:20.849687 20671 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:20.850157 20665 server_base.cc:1047] running on GCE node
I20251024 08:16:20.850307 20665 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:20.850490 20665 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:20.851615 20665 hybrid_clock.cc:648] HybridClock initialized: now 1761293780851602 us; error 30 us; skew 500 ppm
I20251024 08:16:20.852703 20665 webserver.cc:492] Webserver started at http://127.18.80.66:35129/ using document root <none> and password file <none>
I20251024 08:16:20.852968 20665 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:20.853036 20665 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:20.854194 20665 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.000s	sys 0.001s
I20251024 08:16:20.854846 20679 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:20.855003 20665 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.000s	sys 0.001s
I20251024 08:16:20.855083 20665 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal
uuid: "773ff64ed1b249db9be71c247d7cbf43"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:20.855341 20665 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251024 08:16:20.862897 20665 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:20.863137 20665 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:20.863246 20665 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:20.863412 20665 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251024 08:16:20.863798 20686 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251024 08:16:20.864688 20665 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251024 08:16:20.864732 20665 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:20.864778 20665 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251024 08:16:20.865403 20665 ts_tablet_manager.cc:616] Registered 1 tablets
I20251024 08:16:20.865450 20665 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:20.865501 20686 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Bootstrap starting.
I20251024 08:16:20.872402 20665 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.66:39115
I20251024 08:16:20.872467 20793 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.66:39115 every 8 connection(s)
I20251024 08:16:20.872761 20665 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data/info.pb
I20251024 08:16:20.878176 20794 heartbeater.cc:344] Connected to a master server at 127.18.80.126:45025
I20251024 08:16:20.878269 20794 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:20.878468 20794 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:20.879042 19911 ts_manager.cc:194] Re-registered known tserver with Master: 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66:39115)
I20251024 08:16:20.879196 20686 log.cc:826] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Log is configured to *not* fsync() on all Append() calls
I20251024 08:16:20.879487 19911 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.18.80.66:34635
I20251024 08:16:20.882844 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 20665
I20251024 08:16:20.882967 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 19979
W20251024 08:16:20.888022 20512 meta_cache.cc:302] tablet 29f8dacedea249e9883a604ac785b905: replica 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069) has failed: Network error: recv got EOF from 127.18.80.65:35069 (error 108)
I20251024 08:16:20.888562 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.18.80.65:35069
--local_ip_for_outbound_sockets=127.18.80.65
--tserver_master_addrs=127.18.80.126:45025
--webserver_port=40139
--webserver_interface=127.18.80.65
--builtin_ntp_servers=127.18.80.84:38683
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251024 08:16:20.898942 20283 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34238: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:20.898942 20284 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34238: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:20.902935 20284 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34238: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:20.920092 20285 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34238: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:20.924165 20285 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34238: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:20.925266 20686 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Bootstrap replayed 1/1 log segments. Stats: ops{read=300 overwritten=0 applied=298 ignored=0} inserts{seen=14850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251024 08:16:20.925601 20686 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Bootstrap complete.
I20251024 08:16:20.926529 20686 ts_tablet_manager.cc:1403] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Time spent bootstrapping tablet: real 0.061s	user 0.043s	sys 0.016s
W20251024 08:16:20.927351 20285 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34238: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:20.927559 20686 raft_consensus.cc:359] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 1 FOLLOWER]: Replica starting. Triggering 2 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:20.928328 20686 raft_consensus.cc:740] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 773ff64ed1b249db9be71c247d7cbf43, State: Initialized, Role: FOLLOWER
I20251024 08:16:20.928447 20686 consensus_queue.cc:260] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 298, Last appended: 1.300, Last appended by leader: 300, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:20.928665 20794 heartbeater.cc:499] Master 127.18.80.126:45025 was elected leader, sending a full tablet report...
I20251024 08:16:20.928671 20686 ts_tablet_manager.cc:1434] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Time spent starting tablet: real 0.002s	user 0.003s	sys 0.001s
W20251024 08:16:20.940855 20708 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:20.953032 20285 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34238: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:20.970305 20799 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:20.970503 20799 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:20.970526 20799 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:20.971952 20799 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:20.972007 20799 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.18.80.65
I20251024 08:16:20.973480 20799 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:38683
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.18.80.65:35069
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.18.80.65
--webserver_port=40139
--tserver_master_addrs=127.18.80.126:45025
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.20799
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.18.80.65
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:20.973716 20799 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:20.973934 20799 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:20.976339 20809 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:20.976374 20808 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:20.976547 20799 server_base.cc:1047] running on GCE node
W20251024 08:16:20.976353 20811 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:20.976830 20799 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:20.977056 20799 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
W20251024 08:16:20.977685 20708 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
I20251024 08:16:20.978183 20799 hybrid_clock.cc:648] HybridClock initialized: now 1761293780978168 us; error 32 us; skew 500 ppm
I20251024 08:16:20.979212 20799 webserver.cc:492] Webserver started at http://127.18.80.65:40139/ using document root <none> and password file <none>
I20251024 08:16:20.979372 20799 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:20.979416 20799 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:20.980593 20799 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:20.981268 20817 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:20.981437 20799 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:20.981499 20799 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal
uuid: "97d4708eb2b64571b34044be6da3d298"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:20.981761 20799 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20251024 08:16:20.993345 20285 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34238: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:21.005086 20799 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:21.005373 20799 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:21.005501 20799 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:21.005713 20799 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251024 08:16:21.006146 20824 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251024 08:16:21.007021 20799 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251024 08:16:21.007062 20799 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:21.007086 20799 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251024 08:16:21.007609 20799 ts_tablet_manager.cc:616] Registered 1 tablets
I20251024 08:16:21.007639 20799 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:21.007712 20824 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Bootstrap starting.
I20251024 08:16:21.013685 20799 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.65:35069
I20251024 08:16:21.013748 20931 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.65:35069 every 8 connection(s)
I20251024 08:16:21.014032 20799 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data/info.pb
I20251024 08:16:21.018967 20932 heartbeater.cc:344] Connected to a master server at 127.18.80.126:45025
I20251024 08:16:21.019063 20932 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:21.019238 20932 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:21.019802 19911 ts_manager.cc:194] Re-registered known tserver with Master: 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069)
I20251024 08:16:21.020382 19911 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.18.80.65:49589
I20251024 08:16:21.023401 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 20799
I20251024 08:16:21.023486 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 20375
I20251024 08:16:21.024190 20824 log.cc:826] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Log is configured to *not* fsync() on all Append() calls
W20251024 08:16:21.026216 20708 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
I20251024 08:16:21.028353 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.18.80.68:34051
--local_ip_for_outbound_sockets=127.18.80.68
--tserver_master_addrs=127.18.80.126:45025
--webserver_port=33347
--webserver_interface=127.18.80.68
--builtin_ntp_servers=127.18.80.84:38683
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251024 08:16:21.044154 20285 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34238: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:21.066344 20824 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Bootstrap replayed 1/1 log segments. Stats: ops{read=301 overwritten=0 applied=298 ignored=0} inserts{seen=14850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251024 08:16:21.066672 20824 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Bootstrap complete.
I20251024 08:16:21.067611 20824 ts_tablet_manager.cc:1403] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Time spent bootstrapping tablet: real 0.060s	user 0.052s	sys 0.004s
I20251024 08:16:21.068944 20824 raft_consensus.cc:359] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 2 FOLLOWER]: Replica starting. Triggering 3 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:21.069577 20824 raft_consensus.cc:740] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 2 FOLLOWER]: Becoming Follower/Learner. State: Replica: 97d4708eb2b64571b34044be6da3d298, State: Initialized, Role: FOLLOWER
I20251024 08:16:21.069710 20824 consensus_queue.cc:260] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 298, Last appended: 1.301, Last appended by leader: 301, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:21.069949 20824 ts_tablet_manager.cc:1434] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Time spent starting tablet: real 0.002s	user 0.005s	sys 0.000s
I20251024 08:16:21.070003 20932 heartbeater.cc:499] Master 127.18.80.126:45025 was elected leader, sending a full tablet report...
W20251024 08:16:21.088583 20708 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:21.107437 20285 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34238: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:21.107573 20936 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:21.107782 20936 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:21.107816 20936 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:21.109206 20936 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:21.109266 20936 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.18.80.68
I20251024 08:16:21.110870 20936 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:38683
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.18.80.68:34051
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.18.80.68
--webserver_port=33347
--tserver_master_addrs=127.18.80.126:45025
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.20936
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.18.80.68
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:21.111130 20936 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:21.111375 20936 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:21.113853 20947 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:21.113888 20945 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:21.113878 20944 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:21.114058 20936 server_base.cc:1047] running on GCE node
I20251024 08:16:21.114217 20936 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:21.114414 20936 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:21.115578 20936 hybrid_clock.cc:648] HybridClock initialized: now 1761293781115545 us; error 46 us; skew 500 ppm
I20251024 08:16:21.116747 20936 webserver.cc:492] Webserver started at http://127.18.80.68:33347/ using document root <none> and password file <none>
I20251024 08:16:21.117017 20936 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:21.117086 20936 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:21.118216 20936 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:21.118764 20953 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:21.118925 20936 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20251024 08:16:21.118991 20936 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal
uuid: "36b0ebc9a5694a778497ec8d94aba993"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:21.119235 20936 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251024 08:16:21.157500 20936 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:21.157770 20936 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:21.157888 20936 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:21.158079 20936 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251024 08:16:21.158372 20936 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251024 08:16:21.158402 20936 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:21.158453 20936 ts_tablet_manager.cc:616] Registered 0 tablets
I20251024 08:16:21.158479 20936 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:21.163657 20936 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.68:34051
I20251024 08:16:21.163702 21066 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.68:34051 every 8 connection(s)
I20251024 08:16:21.164011 20936 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data/info.pb
I20251024 08:16:21.168087 21067 heartbeater.cc:344] Connected to a master server at 127.18.80.126:45025
I20251024 08:16:21.168184 21067 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:21.168375 21067 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:21.168720 19911 ts_manager.cc:194] Re-registered known tserver with Master: 36b0ebc9a5694a778497ec8d94aba993 (127.18.80.68:34051)
I20251024 08:16:21.169097 19911 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.18.80.68:49115
I20251024 08:16:21.173614 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 20936
I20251024 08:16:21.173702 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 20243
I20251024 08:16:21.179561 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.18.80.67:40981
--local_ip_for_outbound_sockets=127.18.80.67
--tserver_master_addrs=127.18.80.126:45025
--webserver_port=43985
--webserver_interface=127.18.80.67
--builtin_ntp_servers=127.18.80.84:38683
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251024 08:16:21.256441 21070 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:21.256619 21070 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:21.256645 21070 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:21.258051 21070 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:21.258117 21070 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.18.80.67
I20251024 08:16:21.259666 21070 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:38683
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.18.80.67:40981
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.18.80.67
--webserver_port=43985
--tserver_master_addrs=127.18.80.126:45025
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.21070
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.18.80.67
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:21.259910 21070 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:21.260143 21070 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:21.262714 21075 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:21.262719 21078 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:21.262719 21076 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:21.262909 21070 server_base.cc:1047] running on GCE node
I20251024 08:16:21.263111 21070 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:21.263314 21070 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:21.264452 21070 hybrid_clock.cc:648] HybridClock initialized: now 1761293781264429 us; error 38 us; skew 500 ppm
I20251024 08:16:21.265616 21070 webserver.cc:492] Webserver started at http://127.18.80.67:43985/ using document root <none> and password file <none>
I20251024 08:16:21.265807 21070 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:21.265854 21070 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:21.268105 21070 fs_manager.cc:714] Time spent opening directory manager: real 0.002s	user 0.001s	sys 0.000s
I20251024 08:16:21.268694 21084 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:21.268836 21070 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20251024 08:16:21.268924 21070 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal
uuid: "e9ac8f0e11a34e5fb1c19a793f211a56"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:21.269192 21070 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251024 08:16:21.281816 21070 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:21.282078 21070 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:21.282191 21070 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:21.282375 21070 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251024 08:16:21.282794 21091 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251024 08:16:21.283636 21070 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251024 08:16:21.283675 21070 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:21.283712 21070 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251024 08:16:21.284226 21070 ts_tablet_manager.cc:616] Registered 1 tablets
I20251024 08:16:21.284256 21070 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:21.284300 21091 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Bootstrap starting.
I20251024 08:16:21.291096 21070 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.67:40981
I20251024 08:16:21.291175 21198 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.67:40981 every 8 connection(s)
I20251024 08:16:21.291481 21070 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data/info.pb
I20251024 08:16:21.294373 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 21070
I20251024 08:16:21.296149 21199 heartbeater.cc:344] Connected to a master server at 127.18.80.126:45025
I20251024 08:16:21.296253 21199 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:21.296394 20803 raft_consensus.cc:493] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 1 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251024 08:16:21.296482 21199 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:21.296497 20803 raft_consensus.cc:515] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:21.296820 20803 leader_election.cc:290] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers e9ac8f0e11a34e5fb1c19a793f211a56 (127.18.80.67:40981), 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069)
I20251024 08:16:21.297646 19911 ts_manager.cc:194] Re-registered known tserver with Master: e9ac8f0e11a34e5fb1c19a793f211a56 (127.18.80.67:40981)
I20251024 08:16:21.298027 19911 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.18.80.67:37645
I20251024 08:16:21.300218 20886 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "773ff64ed1b249db9be71c247d7cbf43" candidate_term: 2 candidate_status { last_received { term: 1 index: 300 } } ignore_live_leader: false dest_uuid: "97d4708eb2b64571b34044be6da3d298" is_pre_election: true
I20251024 08:16:21.300364 20886 raft_consensus.cc:2410] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 2 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate 773ff64ed1b249db9be71c247d7cbf43 for term 2 because replica has last-logged OpId of term: 1 index: 301, which is greater than that of the candidate, which has last-logged OpId of term: 1 index: 300.
I20251024 08:16:21.300452 21091 log.cc:826] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Log is configured to *not* fsync() on all Append() calls
I20251024 08:16:21.303447 21153 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "773ff64ed1b249db9be71c247d7cbf43" candidate_term: 2 candidate_status { last_received { term: 1 index: 300 } } ignore_live_leader: false dest_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" is_pre_election: true
W20251024 08:16:21.304288 20683 leader_election.cc:343] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [CANDIDATE]: Term 2 pre-election: Tablet error from VoteRequest() call to peer e9ac8f0e11a34e5fb1c19a793f211a56 (127.18.80.67:40981): Illegal state: must be running to vote when last-logged opid is not known
I20251024 08:16:21.304361 20683 leader_election.cc:304] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 773ff64ed1b249db9be71c247d7cbf43; no voters: 97d4708eb2b64571b34044be6da3d298, e9ac8f0e11a34e5fb1c19a793f211a56
I20251024 08:16:21.304490 20803 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 1 FOLLOWER]: Advancing to term 2
I20251024 08:16:21.305719 20803 raft_consensus.cc:2749] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 2 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20251024 08:16:21.350623 21091 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Bootstrap replayed 1/1 log segments. Stats: ops{read=298 overwritten=0 applied=296 ignored=0} inserts{seen=14750 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251024 08:16:21.351022 21091 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Bootstrap complete.
I20251024 08:16:21.351946 21091 ts_tablet_manager.cc:1403] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Time spent bootstrapping tablet: real 0.068s	user 0.046s	sys 0.016s
I20251024 08:16:21.352422 21091 raft_consensus.cc:359] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 2 FOLLOWER]: Replica starting. Triggering 2 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:21.353097 21091 raft_consensus.cc:740] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 2 FOLLOWER]: Becoming Follower/Learner. State: Replica: e9ac8f0e11a34e5fb1c19a793f211a56, State: Initialized, Role: FOLLOWER
I20251024 08:16:21.353225 21091 consensus_queue.cc:260] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 296, Last appended: 1.298, Last appended by leader: 298, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:21.353435 21091 ts_tablet_manager.cc:1434] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Time spent starting tablet: real 0.001s	user 0.004s	sys 0.000s
I20251024 08:16:21.353549 21199 heartbeater.cc:499] Master 127.18.80.126:45025 was elected leader, sending a full tablet report...
I20251024 08:16:21.375830 20939 raft_consensus.cc:493] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 2 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251024 08:16:21.375977 20939 raft_consensus.cc:515] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 2 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:21.376266 20939 leader_election.cc:290] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [CANDIDATE]: Term 3 pre-election: Requested pre-vote from peers 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66:39115), e9ac8f0e11a34e5fb1c19a793f211a56 (127.18.80.67:40981)
I20251024 08:16:21.379329 21153 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "97d4708eb2b64571b34044be6da3d298" candidate_term: 3 candidate_status { last_received { term: 1 index: 301 } } ignore_live_leader: false dest_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" is_pre_election: true
I20251024 08:16:21.379482 21153 raft_consensus.cc:2468] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 2 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 97d4708eb2b64571b34044be6da3d298 in term 2.
I20251024 08:16:21.379444 20748 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "97d4708eb2b64571b34044be6da3d298" candidate_term: 3 candidate_status { last_received { term: 1 index: 301 } } ignore_live_leader: false dest_uuid: "773ff64ed1b249db9be71c247d7cbf43" is_pre_election: true
I20251024 08:16:21.379566 20748 raft_consensus.cc:2468] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 2 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 97d4708eb2b64571b34044be6da3d298 in term 2.
I20251024 08:16:21.379734 20821 leader_election.cc:304] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [CANDIDATE]: Term 3 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 97d4708eb2b64571b34044be6da3d298, e9ac8f0e11a34e5fb1c19a793f211a56; no voters: 
I20251024 08:16:21.379926 20939 raft_consensus.cc:2804] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 2 FOLLOWER]: Leader pre-election won for term 3
I20251024 08:16:21.379984 20939 raft_consensus.cc:493] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 2 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251024 08:16:21.380021 20939 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 2 FOLLOWER]: Advancing to term 3
I20251024 08:16:21.380992 20939 raft_consensus.cc:515] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 3 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:21.381119 20939 leader_election.cc:290] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [CANDIDATE]: Term 3 election: Requested vote from peers 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66:39115), e9ac8f0e11a34e5fb1c19a793f211a56 (127.18.80.67:40981)
I20251024 08:16:21.381254 21153 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "97d4708eb2b64571b34044be6da3d298" candidate_term: 3 candidate_status { last_received { term: 1 index: 301 } } ignore_live_leader: false dest_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56"
I20251024 08:16:21.381304 20748 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "97d4708eb2b64571b34044be6da3d298" candidate_term: 3 candidate_status { last_received { term: 1 index: 301 } } ignore_live_leader: false dest_uuid: "773ff64ed1b249db9be71c247d7cbf43"
I20251024 08:16:21.381343 21153 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 2 FOLLOWER]: Advancing to term 3
I20251024 08:16:21.381366 20748 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 2 FOLLOWER]: Advancing to term 3
I20251024 08:16:21.381889 20748 raft_consensus.cc:2468] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 3 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 97d4708eb2b64571b34044be6da3d298 in term 3.
I20251024 08:16:21.382050 20818 leader_election.cc:304] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [CANDIDATE]: Term 3 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 773ff64ed1b249db9be71c247d7cbf43, 97d4708eb2b64571b34044be6da3d298; no voters: 
I20251024 08:16:21.382126 20939 raft_consensus.cc:2804] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 3 FOLLOWER]: Leader election won for term 3
I20251024 08:16:21.382242 20939 raft_consensus.cc:697] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 3 LEADER]: Becoming Leader. State: Replica: 97d4708eb2b64571b34044be6da3d298, State: Running, Role: LEADER
I20251024 08:16:21.382347 21153 raft_consensus.cc:2468] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 3 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 97d4708eb2b64571b34044be6da3d298 in term 3.
I20251024 08:16:21.382339 20939 consensus_queue.cc:237] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 298, Committed index: 298, Last appended: 1.301, Last appended by leader: 301, Current term: 3, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:21.382972 19911 catalog_manager.cc:5649] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 reported cstate change: term changed from 1 to 3. New cstate: current_term: 3 leader_uuid: "97d4708eb2b64571b34044be6da3d298" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } health_report { overall_health: HEALTHY } } }
I20251024 08:16:21.422915 20866 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20251024 08:16:21.422974 21133 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:21.423655 21001 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:21.424357 20728 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:21.490073 21153 raft_consensus.cc:1275] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 3 FOLLOWER]: Refusing update from remote peer 97d4708eb2b64571b34044be6da3d298: Log matching property violated. Preceding OpId in replica: term: 1 index: 298. Preceding OpId from leader: term: 3 index: 302. (index mismatch)
I20251024 08:16:21.490407 20939 consensus_queue.cc:1048] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [LEADER]: Connected to new peer: Peer: permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 302, Last known committed idx: 296, Time since last communication: 0.000s
I20251024 08:16:21.493156 20748 raft_consensus.cc:1275] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 3 FOLLOWER]: Refusing update from remote peer 97d4708eb2b64571b34044be6da3d298: Log matching property violated. Preceding OpId in replica: term: 1 index: 300. Preceding OpId from leader: term: 3 index: 302. (index mismatch)
I20251024 08:16:21.493614 20939 consensus_queue.cc:1048] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [LEADER]: Connected to new peer: Peer: permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 302, Last known committed idx: 298, Time since last communication: 0.000s
W20251024 08:16:22.022909 20543 scanner-internal.cc:458] Time spent opening tablet: real 2.311s	user 0.001s	sys 0.000s
W20251024 08:16:22.023275 20544 scanner-internal.cc:458] Time spent opening tablet: real 2.311s	user 0.000s	sys 0.001s
W20251024 08:16:22.024216 20542 scanner-internal.cc:458] Time spent opening tablet: real 2.313s	user 0.001s	sys 0.000s
I20251024 08:16:22.169970 21067 heartbeater.cc:499] Master 127.18.80.126:45025 was elected leader, sending a full tablet report...
I20251024 08:16:26.705756 20728 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:26.710301 21001 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:26.710342 20866 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20251024 08:16:26.717409 21133 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:27.101593 19911 ts_manager.cc:284] Unset tserver state for 36b0ebc9a5694a778497ec8d94aba993 from MAINTENANCE_MODE
I20251024 08:16:27.105105 19911 ts_manager.cc:284] Unset tserver state for e9ac8f0e11a34e5fb1c19a793f211a56 from MAINTENANCE_MODE
I20251024 08:16:27.175684 21067 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:27.223887 19911 ts_manager.cc:284] Unset tserver state for 773ff64ed1b249db9be71c247d7cbf43 from MAINTENANCE_MODE
I20251024 08:16:27.235015 19911 ts_manager.cc:284] Unset tserver state for 97d4708eb2b64571b34044be6da3d298 from MAINTENANCE_MODE
I20251024 08:16:27.470911 19911 ts_manager.cc:295] Set tserver state for 773ff64ed1b249db9be71c247d7cbf43 to MAINTENANCE_MODE
I20251024 08:16:27.498080 21199 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:27.500074 20794 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:27.510183 20932 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:27.618068 19911 ts_manager.cc:295] Set tserver state for 97d4708eb2b64571b34044be6da3d298 to MAINTENANCE_MODE
I20251024 08:16:27.734784 20728 tablet_service.cc:1460] Tablet server 773ff64ed1b249db9be71c247d7cbf43 set to quiescing
I20251024 08:16:27.734850 20728 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:27.766006 19911 ts_manager.cc:295] Set tserver state for 36b0ebc9a5694a778497ec8d94aba993 to MAINTENANCE_MODE
I20251024 08:16:27.766479 19910 ts_manager.cc:295] Set tserver state for e9ac8f0e11a34e5fb1c19a793f211a56 to MAINTENANCE_MODE
I20251024 08:16:27.948493 20866 tablet_service.cc:1460] Tablet server 97d4708eb2b64571b34044be6da3d298 set to quiescing
I20251024 08:16:27.948561 20866 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20251024 08:16:27.949201 21271 raft_consensus.cc:993] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: : Instructing follower e9ac8f0e11a34e5fb1c19a793f211a56 to start an election
I20251024 08:16:27.949275 21271 raft_consensus.cc:1081] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 3 LEADER]: Signalling peer e9ac8f0e11a34e5fb1c19a793f211a56 to start an election
I20251024 08:16:27.950057 21153 tablet_service.cc:2038] Received Run Leader Election RPC: tablet_id: "29f8dacedea249e9883a604ac785b905"
dest_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56"
 from {username='slave'} at 127.18.80.65:40731
I20251024 08:16:27.950178 21153 raft_consensus.cc:493] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 3 FOLLOWER]: Starting forced leader election (received explicit request)
I20251024 08:16:27.950212 21153 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 3 FOLLOWER]: Advancing to term 4
I20251024 08:16:27.951172 21153 raft_consensus.cc:515] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 4 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:27.951445 21153 leader_election.cc:290] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [CANDIDATE]: Term 4 election: Requested vote from peers 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66:39115), 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069)
I20251024 08:16:27.952504 21152 raft_consensus.cc:1240] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 4 FOLLOWER]: Rejecting Update request from peer 97d4708eb2b64571b34044be6da3d298 for earlier term 3. Current term is 4. Ops: []
I20251024 08:16:27.953258 21276 consensus_queue.cc:1059] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 }, Status: INVALID_TERM, Last received: 3.6441, Next index: 6442, Last known committed idx: 6440, Time since last communication: 0.000s
I20251024 08:16:27.953449 21271 raft_consensus.cc:3055] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 3 LEADER]: Stepping down as leader of term 3
I20251024 08:16:27.953480 21271 raft_consensus.cc:740] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 3 LEADER]: Becoming Follower/Learner. State: Replica: 97d4708eb2b64571b34044be6da3d298, State: Running, Role: LEADER
I20251024 08:16:27.953512 21271 consensus_queue.cc:260] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 6441, Committed index: 6441, Last appended: 3.6442, Last appended by leader: 6442, Current term: 3, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:27.953645 21001 tablet_service.cc:1460] Tablet server 36b0ebc9a5694a778497ec8d94aba993 set to quiescing
I20251024 08:16:27.953691 21001 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:27.954159 21271 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 3 FOLLOWER]: Advancing to term 4
W20251024 08:16:27.955451 20938 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: Cannot assign timestamp to op. Tablet is not in leader mode. Last heard from a leader: 0.003s ago.
W20251024 08:16:27.955809 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:27.957379 20708 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:27.960913 20708 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:27.962738 21112 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:27.963797 20747 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" candidate_term: 4 candidate_status { last_received { term: 3 index: 6441 } } ignore_live_leader: true dest_uuid: "773ff64ed1b249db9be71c247d7cbf43"
I20251024 08:16:27.963872 20747 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 3 FOLLOWER]: Advancing to term 4
I20251024 08:16:27.964447 20747 raft_consensus.cc:2410] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 4 FOLLOWER]: Leader election vote request: Denying vote to candidate e9ac8f0e11a34e5fb1c19a793f211a56 for term 4 because replica has last-logged OpId of term: 3 index: 6442, which is greater than that of the candidate, which has last-logged OpId of term: 3 index: 6441.
I20251024 08:16:27.965030 20886 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" candidate_term: 4 candidate_status { last_received { term: 3 index: 6441 } } ignore_live_leader: true dest_uuid: "97d4708eb2b64571b34044be6da3d298"
I20251024 08:16:27.965123 20886 raft_consensus.cc:2410] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 4 FOLLOWER]: Leader election vote request: Denying vote to candidate e9ac8f0e11a34e5fb1c19a793f211a56 for term 4 because replica has last-logged OpId of term: 3 index: 6442, which is greater than that of the candidate, which has last-logged OpId of term: 3 index: 6441.
I20251024 08:16:27.965272 21085 leader_election.cc:304] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [CANDIDATE]: Term 4 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: e9ac8f0e11a34e5fb1c19a793f211a56; no voters: 773ff64ed1b249db9be71c247d7cbf43, 97d4708eb2b64571b34044be6da3d298
I20251024 08:16:27.965502 21443 raft_consensus.cc:2749] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 4 FOLLOWER]: Leader election lost for term 4. Reason: could not achieve majority
W20251024 08:16:27.967614 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:27.969741 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:27.971519 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:27.975996 20708 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:27.976009 20707 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:27.982611 21109 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:27.983685 21112 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:27.984004 21133 tablet_service.cc:1460] Tablet server e9ac8f0e11a34e5fb1c19a793f211a56 set to quiescing
I20251024 08:16:27.984054 21133 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
W20251024 08:16:27.990644 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:27.992602 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.000115 20707 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.001602 20707 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.010200 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.012305 21109 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.022297 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.024047 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.035578 20707 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.037266 20707 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.049450 21109 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.051497 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.064482 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.068358 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.080876 20707 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.086529 20707 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.098102 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.102272 21112 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.116379 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.119158 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.133685 20707 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.140318 20707 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.156031 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.159984 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.176131 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
I20251024 08:16:28.179515 21067 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
W20251024 08:16:28.182873 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.198508 20707 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.207172 20707 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.221513 21110 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.230904 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.244676 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.254711 21454 raft_consensus.cc:670] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: failed to trigger leader election: Illegal state: leader elections are disabled
W20251024 08:16:28.255074 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.267802 20707 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.279531 20707 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.296124 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.307405 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.308678 21443 raft_consensus.cc:670] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: failed to trigger leader election: Illegal state: leader elections are disabled
W20251024 08:16:28.315989 21271 raft_consensus.cc:670] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: failed to trigger leader election: Illegal state: leader elections are disabled
W20251024 08:16:28.322891 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.337826 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.352777 20707 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.366693 20707 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.380556 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.394785 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.411154 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.423746 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.441829 20707 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.456544 20707 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.473446 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.490698 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.508235 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.522843 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.542636 20707 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.558416 20707 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.580384 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.594421 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.616932 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.629945 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.654731 20707 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.666436 20707 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.695545 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.706712 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.734014 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.747906 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.775818 20707 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.789577 20707 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.819607 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.828176 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.862326 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.872071 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.904857 20707 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.916709 20707 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.948733 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.960740 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:28.994076 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:29.004734 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:29.043128 20707 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:29.052059 20707 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:52344: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
I20251024 08:16:29.090701 20866 tablet_service.cc:1460] Tablet server 97d4708eb2b64571b34044be6da3d298 set to quiescing
I20251024 08:16:29.090777 20866 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251024 08:16:29.090880 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:29.097114 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:29.127596 21133 tablet_service.cc:1460] Tablet server e9ac8f0e11a34e5fb1c19a793f211a56 set to quiescing
I20251024 08:16:29.127660 21133 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251024 08:16:29.141563 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:29.144101 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
I20251024 08:16:29.183768 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 20665
W20251024 08:16:29.191358 20512 connection.cc:537] client connection to 127.18.80.66:39115 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20251024 08:16:29.191488 20512 meta_cache.cc:302] tablet 29f8dacedea249e9883a604ac785b905: replica 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66:39115) has failed: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
I20251024 08:16:29.191808 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.18.80.66:39115
--local_ip_for_outbound_sockets=127.18.80.66
--tserver_master_addrs=127.18.80.126:45025
--webserver_port=35129
--webserver_interface=127.18.80.66
--builtin_ntp_servers=127.18.80.84:38683
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251024 08:16:29.195165 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:29.239481 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:29.244993 20844 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:38400: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:29.268507 21477 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:29.268705 21477 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:29.268738 21477 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:29.270246 21477 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:29.270359 21477 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.18.80.66
I20251024 08:16:29.271936 21477 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:38683
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.18.80.66:39115
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.18.80.66
--webserver_port=35129
--tserver_master_addrs=127.18.80.126:45025
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.21477
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.18.80.66
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:29.272205 21477 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:29.272467 21477 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:29.274966 21485 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:29.275072 21477 server_base.cc:1047] running on GCE node
W20251024 08:16:29.274976 21482 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:29.274966 21483 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:29.275305 21477 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:29.275490 21477 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:29.276633 21477 hybrid_clock.cc:648] HybridClock initialized: now 1761293789276624 us; error 23 us; skew 500 ppm
I20251024 08:16:29.277724 21477 webserver.cc:492] Webserver started at http://127.18.80.66:35129/ using document root <none> and password file <none>
I20251024 08:16:29.277921 21477 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:29.277967 21477 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:29.279038 21477 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:29.279640 21491 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:29.279794 21477 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20251024 08:16:29.279861 21477 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal
uuid: "773ff64ed1b249db9be71c247d7cbf43"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:29.280108 21477 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251024 08:16:29.287537 21477 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:29.287801 21477 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:29.287905 21477 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:29.288066 21477 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251024 08:16:29.288443 21498 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251024 08:16:29.289295 21477 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251024 08:16:29.289337 21477 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:29.289383 21477 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251024 08:16:29.289875 21477 ts_tablet_manager.cc:616] Registered 1 tablets
I20251024 08:16:29.289908 21477 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:29.289955 21498 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Bootstrap starting.
W20251024 08:16:29.295761 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:29.296257 21477 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.66:39115
I20251024 08:16:29.296329 21606 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.66:39115 every 8 connection(s)
I20251024 08:16:29.296597 21477 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data/info.pb
I20251024 08:16:29.301705 21607 heartbeater.cc:344] Connected to a master server at 127.18.80.126:45025
I20251024 08:16:29.301821 21607 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:29.302044 21607 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:29.302589 19910 ts_manager.cc:194] Re-registered known tserver with Master: 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66:39115)
I20251024 08:16:29.303128 19910 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.18.80.66:51637
I20251024 08:16:29.306516 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 21477
I20251024 08:16:29.306629 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 20799
I20251024 08:16:29.314551 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.18.80.65:35069
--local_ip_for_outbound_sockets=127.18.80.65
--tserver_master_addrs=127.18.80.126:45025
--webserver_port=40139
--webserver_interface=127.18.80.65
--builtin_ntp_servers=127.18.80.84:38683
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251024 08:16:29.318629 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:29.329043 21498 log.cc:826] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Log is configured to *not* fsync() on all Append() calls
W20251024 08:16:29.340761 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:29.367291 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:29.405802 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:29.410447 21611 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:29.410637 21611 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:29.410670 21611 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:29.412109 21611 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:29.412168 21611 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.18.80.65
I20251024 08:16:29.413625 21611 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:38683
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.18.80.65:35069
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.18.80.65
--webserver_port=40139
--tserver_master_addrs=127.18.80.126:45025
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.21611
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.18.80.65
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:29.413838 21611 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:29.414059 21611 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:29.416517 21619 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:29.416592 21618 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:29.416517 21621 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:29.416723 21611 server_base.cc:1047] running on GCE node
I20251024 08:16:29.416920 21611 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:29.417187 21611 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:29.418289 21611 hybrid_clock.cc:648] HybridClock initialized: now 1761293789418278 us; error 33 us; skew 500 ppm
I20251024 08:16:29.419529 21611 webserver.cc:492] Webserver started at http://127.18.80.65:40139/ using document root <none> and password file <none>
I20251024 08:16:29.419726 21611 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:29.419770 21611 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:29.421000 21611 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:29.421712 21627 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:29.422652 21611 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:29.422734 21611 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal
uuid: "97d4708eb2b64571b34044be6da3d298"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:29.422984 21611 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251024 08:16:29.436300 21611 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:29.436558 21611 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:29.436699 21611 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:29.436939 21611 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251024 08:16:29.437381 21634 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251024 08:16:29.438468 21611 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251024 08:16:29.438512 21611 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:29.438557 21611 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251024 08:16:29.439060 21611 ts_tablet_manager.cc:616] Registered 1 tablets
I20251024 08:16:29.439091 21611 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:29.439168 21634 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Bootstrap starting.
I20251024 08:16:29.446038 21611 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.65:35069
I20251024 08:16:29.446389 21611 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data/info.pb
W20251024 08:16:29.449127 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:29.449800 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 21611
I20251024 08:16:29.449894 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 20936
W20251024 08:16:29.451179 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:29.455889 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.18.80.68:34051
--local_ip_for_outbound_sockets=127.18.80.68
--tserver_master_addrs=127.18.80.126:45025
--webserver_port=33347
--webserver_interface=127.18.80.68
--builtin_ntp_servers=127.18.80.84:38683
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251024 08:16:29.458081 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:29.461763 21741 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.65:35069 every 8 connection(s)
I20251024 08:16:29.463191 21742 heartbeater.cc:344] Connected to a master server at 127.18.80.126:45025
I20251024 08:16:29.463292 21742 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:29.463526 21742 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:29.464089 19910 ts_manager.cc:194] Re-registered known tserver with Master: 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069)
I20251024 08:16:29.464589 19910 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.18.80.65:57527
I20251024 08:16:29.496838 21634 log.cc:826] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Log is configured to *not* fsync() on all Append() calls
W20251024 08:16:29.511078 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:29.539276 21745 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:29.539451 21745 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:29.539485 21745 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:29.540992 21745 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:29.541057 21745 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.18.80.68
I20251024 08:16:29.542666 21745 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:38683
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.18.80.68:34051
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.18.80.68
--webserver_port=33347
--tserver_master_addrs=127.18.80.126:45025
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.21745
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.18.80.68
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:29.542917 21745 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:29.543148 21745 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:29.545414 21752 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:29.545557 21755 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:29.546123 21753 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:29.546452 21745 server_base.cc:1047] running on GCE node
I20251024 08:16:29.546622 21745 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:29.546841 21745 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:29.550056 21745 hybrid_clock.cc:648] HybridClock initialized: now 1761293789550038 us; error 30 us; skew 500 ppm
I20251024 08:16:29.551226 21745 webserver.cc:492] Webserver started at http://127.18.80.68:33347/ using document root <none> and password file <none>
I20251024 08:16:29.551472 21745 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:29.551541 21745 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:29.553023 21745 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:29.553625 21761 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:29.553799 21745 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:29.553874 21745 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal
uuid: "36b0ebc9a5694a778497ec8d94aba993"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:29.554139 21745 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251024 08:16:29.571959 21745 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:29.572245 21745 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:29.572358 21745 kserver.cc:163] Server-wide thread pool size limit: 3276
W20251024 08:16:29.572383 21113 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:34268: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:29.572582 21745 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251024 08:16:29.572969 21745 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251024 08:16:29.573010 21745 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:29.573042 21745 ts_tablet_manager.cc:616] Registered 0 tablets
I20251024 08:16:29.573063 21745 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:29.579783 21745 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.68:34051
I20251024 08:16:29.580140 21745 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data/info.pb
I20251024 08:16:29.580680 21874 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.68:34051 every 8 connection(s)
I20251024 08:16:29.581808 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 21745
I20251024 08:16:29.581892 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 21070
I20251024 08:16:29.591084 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.18.80.67:40981
--local_ip_for_outbound_sockets=127.18.80.67
--tserver_master_addrs=127.18.80.126:45025
--webserver_port=43985
--webserver_interface=127.18.80.67
--builtin_ntp_servers=127.18.80.84:38683
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251024 08:16:29.594247 21875 heartbeater.cc:344] Connected to a master server at 127.18.80.126:45025
I20251024 08:16:29.594372 21875 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:29.594609 21875 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:29.594992 19910 ts_manager.cc:194] Re-registered known tserver with Master: 36b0ebc9a5694a778497ec8d94aba993 (127.18.80.68:34051)
I20251024 08:16:29.595474 19910 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.18.80.68:42267
I20251024 08:16:29.628562 20544 meta_cache.cc:1510] marking tablet server e9ac8f0e11a34e5fb1c19a793f211a56 (127.18.80.67:40981) as failed
W20251024 08:16:29.673676 21878 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:29.673847 21878 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:29.673877 21878 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:29.675334 21878 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:29.675403 21878 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.18.80.67
I20251024 08:16:29.676857 21878 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:38683
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.18.80.67:40981
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.18.80.67
--webserver_port=43985
--tserver_master_addrs=127.18.80.126:45025
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.21878
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.18.80.67
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:29.677250 21878 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:29.677508 21878 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:29.679900 21883 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:29.679982 21886 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:29.680105 21884 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:29.680460 21878 server_base.cc:1047] running on GCE node
I20251024 08:16:29.680634 21878 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:29.680853 21878 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:29.682013 21878 hybrid_clock.cc:648] HybridClock initialized: now 1761293789682004 us; error 28 us; skew 500 ppm
I20251024 08:16:29.683140 21878 webserver.cc:492] Webserver started at http://127.18.80.67:43985/ using document root <none> and password file <none>
I20251024 08:16:29.683328 21878 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:29.683372 21878 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:29.684592 21878 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:29.685273 21892 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:29.685426 21878 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:29.685492 21878 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal
uuid: "e9ac8f0e11a34e5fb1c19a793f211a56"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:29.685755 21878 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251024 08:16:29.699878 21878 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:29.700167 21878 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:29.700299 21878 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:29.700549 21878 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251024 08:16:29.701079 21899 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251024 08:16:29.701925 21878 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251024 08:16:29.701977 21878 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:29.702013 21878 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251024 08:16:29.702569 21878 ts_tablet_manager.cc:616] Registered 1 tablets
I20251024 08:16:29.702608 21878 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:29.702733 21899 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Bootstrap starting.
I20251024 08:16:29.711086 21878 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.67:40981
I20251024 08:16:29.711534 21878 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data/info.pb
I20251024 08:16:29.712008 22006 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.67:40981 every 8 connection(s)
I20251024 08:16:29.718732 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 21878
I20251024 08:16:29.726235 22007 heartbeater.cc:344] Connected to a master server at 127.18.80.126:45025
I20251024 08:16:29.726476 22007 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:29.726797 22007 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:29.727381 19910 ts_manager.cc:194] Re-registered known tserver with Master: e9ac8f0e11a34e5fb1c19a793f211a56 (127.18.80.67:40981)
I20251024 08:16:29.727892 19910 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.18.80.67:59265
I20251024 08:16:29.759361 21899 log.cc:826] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Log is configured to *not* fsync() on all Append() calls
I20251024 08:16:29.893455 21541 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:29.907593 21809 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:29.917878 21941 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:29.927470 21658 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:30.303990 21607 heartbeater.cc:499] Master 127.18.80.126:45025 was elected leader, sending a full tablet report...
I20251024 08:16:30.465512 21742 heartbeater.cc:499] Master 127.18.80.126:45025 was elected leader, sending a full tablet report...
I20251024 08:16:30.503723 21498 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Bootstrap replayed 1/2 log segments. Stats: ops{read=4826 overwritten=0 applied=4825 ignored=0} inserts{seen=241150 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20251024 08:16:30.573068 21899 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Bootstrap replayed 1/2 log segments. Stats: ops{read=4830 overwritten=0 applied=4828 ignored=0} inserts{seen=241300 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251024 08:16:30.596304 21875 heartbeater.cc:499] Master 127.18.80.126:45025 was elected leader, sending a full tablet report...
I20251024 08:16:30.633730 21634 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Bootstrap replayed 1/2 log segments. Stats: ops{read=4622 overwritten=0 applied=4619 ignored=0} inserts{seen=230850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251024 08:16:30.728940 22007 heartbeater.cc:499] Master 127.18.80.126:45025 was elected leader, sending a full tablet report...
I20251024 08:16:30.838037 21899 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Bootstrap replayed 2/2 log segments. Stats: ops{read=6441 overwritten=0 applied=6440 ignored=0} inserts{seen=321900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20251024 08:16:30.838474 21899 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Bootstrap complete.
I20251024 08:16:30.841259 21899 ts_tablet_manager.cc:1403] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Time spent bootstrapping tablet: real 1.139s	user 0.955s	sys 0.151s
I20251024 08:16:30.842322 21899 raft_consensus.cc:359] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 4 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:30.842911 21899 raft_consensus.cc:740] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 4 FOLLOWER]: Becoming Follower/Learner. State: Replica: e9ac8f0e11a34e5fb1c19a793f211a56, State: Initialized, Role: FOLLOWER
I20251024 08:16:30.843106 21899 consensus_queue.cc:260] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 6440, Last appended: 3.6441, Last appended by leader: 6441, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:30.843318 21899 ts_tablet_manager.cc:1434] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
I20251024 08:16:30.904652 21498 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Bootstrap replayed 2/2 log segments. Stats: ops{read=6442 overwritten=0 applied=6441 ignored=0} inserts{seen=321950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20251024 08:16:30.905077 21498 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Bootstrap complete.
I20251024 08:16:30.907774 21498 ts_tablet_manager.cc:1403] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Time spent bootstrapping tablet: real 1.618s	user 1.369s	sys 0.243s
I20251024 08:16:30.908569 21498 raft_consensus.cc:359] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 4 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:30.909287 21498 raft_consensus.cc:740] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 4 FOLLOWER]: Becoming Follower/Learner. State: Replica: 773ff64ed1b249db9be71c247d7cbf43, State: Initialized, Role: FOLLOWER
I20251024 08:16:30.909411 21498 consensus_queue.cc:260] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 6441, Last appended: 3.6442, Last appended by leader: 6442, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:30.909644 21498 ts_tablet_manager.cc:1434] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Time spent starting tablet: real 0.002s	user 0.005s	sys 0.000s
W20251024 08:16:30.980327 21903 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:30.987238 21903 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:31.013666 21634 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Bootstrap replayed 2/2 log segments. Stats: ops{read=6442 overwritten=0 applied=6441 ignored=0} inserts{seen=321950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20251024 08:16:31.014086 21634 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Bootstrap complete.
I20251024 08:16:31.016758 21634 ts_tablet_manager.cc:1403] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Time spent bootstrapping tablet: real 1.578s	user 1.363s	sys 0.199s
I20251024 08:16:31.017838 21634 raft_consensus.cc:359] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 4 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:31.018383 21634 raft_consensus.cc:740] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 4 FOLLOWER]: Becoming Follower/Learner. State: Replica: 97d4708eb2b64571b34044be6da3d298, State: Initialized, Role: FOLLOWER
I20251024 08:16:31.018505 21634 consensus_queue.cc:260] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 6441, Last appended: 3.6442, Last appended by leader: 6442, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:31.018703 21634 ts_tablet_manager.cc:1434] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Time spent starting tablet: real 0.002s	user 0.005s	sys 0.000s
W20251024 08:16:31.058887 21638 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:31.066449 21638 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:31.138535 21521 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:31.144366 21521 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
I20251024 08:16:31.149250 22048 raft_consensus.cc:493] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 4 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251024 08:16:31.149380 22048 raft_consensus.cc:515] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 4 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:31.149674 22048 leader_election.cc:290] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [CANDIDATE]: Term 5 pre-election: Requested pre-vote from peers 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66:39115), 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069)
I20251024 08:16:31.153631 21696 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" candidate_term: 5 candidate_status { last_received { term: 3 index: 6441 } } ignore_live_leader: false dest_uuid: "97d4708eb2b64571b34044be6da3d298" is_pre_election: true
I20251024 08:16:31.153664 21561 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" candidate_term: 5 candidate_status { last_received { term: 3 index: 6441 } } ignore_live_leader: false dest_uuid: "773ff64ed1b249db9be71c247d7cbf43" is_pre_election: true
I20251024 08:16:31.153789 21696 raft_consensus.cc:2410] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 4 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate e9ac8f0e11a34e5fb1c19a793f211a56 for term 5 because replica has last-logged OpId of term: 3 index: 6442, which is greater than that of the candidate, which has last-logged OpId of term: 3 index: 6441.
I20251024 08:16:31.153808 21561 raft_consensus.cc:2410] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 4 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate e9ac8f0e11a34e5fb1c19a793f211a56 for term 5 because replica has last-logged OpId of term: 3 index: 6442, which is greater than that of the candidate, which has last-logged OpId of term: 3 index: 6441.
I20251024 08:16:31.154032 21893 leader_election.cc:304] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [CANDIDATE]: Term 5 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: e9ac8f0e11a34e5fb1c19a793f211a56; no voters: 773ff64ed1b249db9be71c247d7cbf43, 97d4708eb2b64571b34044be6da3d298
I20251024 08:16:31.154174 22048 raft_consensus.cc:2749] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 4 FOLLOWER]: Leader pre-election lost for term 5. Reason: could not achieve majority
I20251024 08:16:31.188578 22050 raft_consensus.cc:493] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 4 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251024 08:16:31.188776 22050 raft_consensus.cc:515] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 4 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:31.189148 22050 leader_election.cc:290] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [CANDIDATE]: Term 5 pre-election: Requested pre-vote from peers e9ac8f0e11a34e5fb1c19a793f211a56 (127.18.80.67:40981), 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069)
I20251024 08:16:31.196578 21961 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "773ff64ed1b249db9be71c247d7cbf43" candidate_term: 5 candidate_status { last_received { term: 3 index: 6442 } } ignore_live_leader: false dest_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" is_pre_election: true
I20251024 08:16:31.196753 21961 raft_consensus.cc:2468] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 4 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 773ff64ed1b249db9be71c247d7cbf43 in term 4.
I20251024 08:16:31.197172 21495 leader_election.cc:304] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [CANDIDATE]: Term 5 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 773ff64ed1b249db9be71c247d7cbf43, e9ac8f0e11a34e5fb1c19a793f211a56; no voters: 
I20251024 08:16:31.197187 21696 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "773ff64ed1b249db9be71c247d7cbf43" candidate_term: 5 candidate_status { last_received { term: 3 index: 6442 } } ignore_live_leader: false dest_uuid: "97d4708eb2b64571b34044be6da3d298" is_pre_election: true
I20251024 08:16:31.197341 21696 raft_consensus.cc:2468] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 4 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 773ff64ed1b249db9be71c247d7cbf43 in term 4.
I20251024 08:16:31.197497 22050 raft_consensus.cc:2804] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 4 FOLLOWER]: Leader pre-election won for term 5
I20251024 08:16:31.197589 22050 raft_consensus.cc:493] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 4 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251024 08:16:31.197618 22050 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 4 FOLLOWER]: Advancing to term 5
I20251024 08:16:31.198649 22050 raft_consensus.cc:515] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 5 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:31.198768 22050 leader_election.cc:290] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [CANDIDATE]: Term 5 election: Requested vote from peers e9ac8f0e11a34e5fb1c19a793f211a56 (127.18.80.67:40981), 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069)
I20251024 08:16:31.198977 21961 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "773ff64ed1b249db9be71c247d7cbf43" candidate_term: 5 candidate_status { last_received { term: 3 index: 6442 } } ignore_live_leader: false dest_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56"
I20251024 08:16:31.198973 21696 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "773ff64ed1b249db9be71c247d7cbf43" candidate_term: 5 candidate_status { last_received { term: 3 index: 6442 } } ignore_live_leader: false dest_uuid: "97d4708eb2b64571b34044be6da3d298"
I20251024 08:16:31.199054 21961 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 4 FOLLOWER]: Advancing to term 5
I20251024 08:16:31.199054 21696 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 4 FOLLOWER]: Advancing to term 5
I20251024 08:16:31.200177 21696 raft_consensus.cc:2468] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 5 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 773ff64ed1b249db9be71c247d7cbf43 in term 5.
I20251024 08:16:31.200397 21961 raft_consensus.cc:2468] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 5 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 773ff64ed1b249db9be71c247d7cbf43 in term 5.
I20251024 08:16:31.200389 21492 leader_election.cc:304] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [CANDIDATE]: Term 5 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 773ff64ed1b249db9be71c247d7cbf43, 97d4708eb2b64571b34044be6da3d298; no voters: 
I20251024 08:16:31.200541 22050 raft_consensus.cc:2804] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 5 FOLLOWER]: Leader election won for term 5
I20251024 08:16:31.200702 22050 raft_consensus.cc:697] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 5 LEADER]: Becoming Leader. State: Replica: 773ff64ed1b249db9be71c247d7cbf43, State: Running, Role: LEADER
I20251024 08:16:31.200803 22050 consensus_queue.cc:237] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 6441, Committed index: 6441, Last appended: 3.6442, Last appended by leader: 6442, Current term: 5, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:31.201508 19911 catalog_manager.cc:5649] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 reported cstate change: term changed from 3 to 5, leader changed from 97d4708eb2b64571b34044be6da3d298 (127.18.80.65) to 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66). New cstate: current_term: 5 leader_uuid: "773ff64ed1b249db9be71c247d7cbf43" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } health_report { overall_health: UNKNOWN } } }
W20251024 08:16:31.219388 21903 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:31.223407 21903 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:31.280018 21696 raft_consensus.cc:1275] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 5 FOLLOWER]: Refusing update from remote peer 773ff64ed1b249db9be71c247d7cbf43: Log matching property violated. Preceding OpId in replica: term: 3 index: 6442. Preceding OpId from leader: term: 5 index: 6443. (index mismatch)
I20251024 08:16:31.280339 22050 consensus_queue.cc:1048] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [LEADER]: Connected to new peer: Peer: permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 6443, Last known committed idx: 6441, Time since last communication: 0.000s
I20251024 08:16:31.282785 21961 raft_consensus.cc:1275] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 5 FOLLOWER]: Refusing update from remote peer 773ff64ed1b249db9be71c247d7cbf43: Log matching property violated. Preceding OpId in replica: term: 3 index: 6441. Preceding OpId from leader: term: 5 index: 6443. (index mismatch)
I20251024 08:16:31.283233 22050 consensus_queue.cc:1048] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [LEADER]: Connected to new peer: Peer: permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 6443, Last known committed idx: 6440, Time since last communication: 0.000s
W20251024 08:16:31.283802 21903 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:32.239785 20542 scanner-internal.cc:458] Time spent opening tablet: real 3.807s	user 0.001s	sys 0.000s
W20251024 08:16:32.243578 20543 scanner-internal.cc:458] Time spent opening tablet: real 3.805s	user 0.001s	sys 0.000s
W20251024 08:16:32.433710 20544 scanner-internal.cc:458] Time spent opening tablet: real 4.009s	user 0.001s	sys 0.000s
I20251024 08:16:35.205014 21541 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20251024 08:16:35.215153 21658 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:35.221458 21941 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
I20251024 08:16:35.222012 21809 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:35.532758 19913 ts_manager.cc:284] Unset tserver state for 36b0ebc9a5694a778497ec8d94aba993 from MAINTENANCE_MODE
I20251024 08:16:35.600473 21875 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:35.647635 19913 ts_manager.cc:284] Unset tserver state for 97d4708eb2b64571b34044be6da3d298 from MAINTENANCE_MODE
I20251024 08:16:35.692373 19913 ts_manager.cc:284] Unset tserver state for 773ff64ed1b249db9be71c247d7cbf43 from MAINTENANCE_MODE
I20251024 08:16:35.696705 19913 ts_manager.cc:284] Unset tserver state for e9ac8f0e11a34e5fb1c19a793f211a56 from MAINTENANCE_MODE
I20251024 08:16:36.105341 19913 ts_manager.cc:295] Set tserver state for 773ff64ed1b249db9be71c247d7cbf43 to MAINTENANCE_MODE
I20251024 08:16:36.122905 19913 ts_manager.cc:295] Set tserver state for 97d4708eb2b64571b34044be6da3d298 to MAINTENANCE_MODE
I20251024 08:16:36.139258 19913 ts_manager.cc:295] Set tserver state for e9ac8f0e11a34e5fb1c19a793f211a56 to MAINTENANCE_MODE
I20251024 08:16:36.145102 19913 ts_manager.cc:295] Set tserver state for 36b0ebc9a5694a778497ec8d94aba993 to MAINTENANCE_MODE
I20251024 08:16:36.286494 21742 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:36.290216 21607 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:36.291086 22007 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:36.352108 21541 tablet_service.cc:1460] Tablet server 773ff64ed1b249db9be71c247d7cbf43 set to quiescing
I20251024 08:16:36.352198 21541 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20251024 08:16:36.355195 22082 raft_consensus.cc:993] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: : Instructing follower e9ac8f0e11a34e5fb1c19a793f211a56 to start an election
I20251024 08:16:36.355291 22082 raft_consensus.cc:1081] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 5 LEADER]: Signalling peer e9ac8f0e11a34e5fb1c19a793f211a56 to start an election
I20251024 08:16:36.355680 21961 tablet_service.cc:2038] Received Run Leader Election RPC: tablet_id: "29f8dacedea249e9883a604ac785b905"
dest_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56"
 from {username='slave'} at 127.18.80.66:55115
I20251024 08:16:36.355780 21961 raft_consensus.cc:493] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 5 FOLLOWER]: Starting forced leader election (received explicit request)
I20251024 08:16:36.355813 21961 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 5 FOLLOWER]: Advancing to term 6
I20251024 08:16:36.356684 21961 raft_consensus.cc:515] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 6 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:36.356822 21961 leader_election.cc:290] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [CANDIDATE]: Term 6 election: Requested vote from peers 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66:39115), 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069)
I20251024 08:16:36.357161 21695 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" candidate_term: 6 candidate_status { last_received { term: 5 index: 11182 } } ignore_live_leader: true dest_uuid: "97d4708eb2b64571b34044be6da3d298"
I20251024 08:16:36.357671 22060 raft_consensus.cc:993] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: : Instructing follower e9ac8f0e11a34e5fb1c19a793f211a56 to start an election
I20251024 08:16:36.357738 22060 raft_consensus.cc:1081] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 5 LEADER]: Signalling peer e9ac8f0e11a34e5fb1c19a793f211a56 to start an election
I20251024 08:16:36.357710 21561 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" candidate_term: 6 candidate_status { last_received { term: 5 index: 11182 } } ignore_live_leader: true dest_uuid: "773ff64ed1b249db9be71c247d7cbf43"
I20251024 08:16:36.357777 21561 raft_consensus.cc:3055] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 5 LEADER]: Stepping down as leader of term 5
I20251024 08:16:36.357800 21561 raft_consensus.cc:740] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 5 LEADER]: Becoming Follower/Learner. State: Replica: 773ff64ed1b249db9be71c247d7cbf43, State: Running, Role: LEADER
I20251024 08:16:36.357867 21561 consensus_queue.cc:260] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 11182, Committed index: 11182, Last appended: 5.11182, Last appended by leader: 11182, Current term: 5, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:36.357951 21561 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 5 FOLLOWER]: Advancing to term 6
I20251024 08:16:36.358142 21961 tablet_service.cc:2038] Received Run Leader Election RPC: tablet_id: "29f8dacedea249e9883a604ac785b905"
dest_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56"
 from {username='slave'} at 127.18.80.66:55115
I20251024 08:16:36.358196 21961 raft_consensus.cc:493] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 6 FOLLOWER]: Starting forced leader election (received explicit request)
I20251024 08:16:36.358217 21961 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 6 FOLLOWER]: Advancing to term 7
I20251024 08:16:36.358767 21561 raft_consensus.cc:2468] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 6 FOLLOWER]: Leader election vote request: Granting yes vote for candidate e9ac8f0e11a34e5fb1c19a793f211a56 in term 6.
I20251024 08:16:36.358829 21961 raft_consensus.cc:515] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 7 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:36.358939 21961 leader_election.cc:290] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [CANDIDATE]: Term 7 election: Requested vote from peers 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66:39115), 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069)
W20251024 08:16:36.358922 21521 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
I20251024 08:16:36.359161 21893 leader_election.cc:304] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [CANDIDATE]: Term 6 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 773ff64ed1b249db9be71c247d7cbf43, e9ac8f0e11a34e5fb1c19a793f211a56; no voters: 97d4708eb2b64571b34044be6da3d298
I20251024 08:16:36.359164 21561 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" candidate_term: 7 candidate_status { last_received { term: 5 index: 11182 } } ignore_live_leader: true dest_uuid: "773ff64ed1b249db9be71c247d7cbf43"
I20251024 08:16:36.359213 21561 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 6 FOLLOWER]: Advancing to term 7
I20251024 08:16:36.359390 21696 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" candidate_term: 7 candidate_status { last_received { term: 5 index: 11182 } } ignore_live_leader: true dest_uuid: "97d4708eb2b64571b34044be6da3d298"
I20251024 08:16:36.359468 21696 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 5 FOLLOWER]: Advancing to term 7
I20251024 08:16:36.359890 21561 raft_consensus.cc:2468] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 7 FOLLOWER]: Leader election vote request: Granting yes vote for candidate e9ac8f0e11a34e5fb1c19a793f211a56 in term 7.
I20251024 08:16:36.360091 21893 leader_election.cc:304] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [CANDIDATE]: Term 7 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 773ff64ed1b249db9be71c247d7cbf43, e9ac8f0e11a34e5fb1c19a793f211a56; no voters: 
I20251024 08:16:36.361325 21696 raft_consensus.cc:2468] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 7 FOLLOWER]: Leader election vote request: Granting yes vote for candidate e9ac8f0e11a34e5fb1c19a793f211a56 in term 7.
I20251024 08:16:36.361843 22234 raft_consensus.cc:2764] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 7 FOLLOWER]: Leader election decision vote started in defunct term 6: won
W20251024 08:16:36.361850 21521 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.361881 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
I20251024 08:16:36.362504 22234 raft_consensus.cc:2804] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 7 FOLLOWER]: Leader election won for term 7
I20251024 08:16:36.362833 22234 raft_consensus.cc:697] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 7 LEADER]: Becoming Leader. State: Replica: e9ac8f0e11a34e5fb1c19a793f211a56, State: Running, Role: LEADER
I20251024 08:16:36.363133 22234 consensus_queue.cc:237] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 11182, Committed index: 11182, Last appended: 5.11182, Last appended by leader: 11182, Current term: 7, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:36.364547 19913 catalog_manager.cc:5649] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 reported cstate change: term changed from 5 to 7, leader changed from 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66) to e9ac8f0e11a34e5fb1c19a793f211a56 (127.18.80.67). New cstate: current_term: 7 leader_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } health_report { overall_health: UNKNOWN } } }
I20251024 08:16:36.366200 21696 raft_consensus.cc:1275] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 7 FOLLOWER]: Refusing update from remote peer e9ac8f0e11a34e5fb1c19a793f211a56: Log matching property violated. Preceding OpId in replica: term: 5 index: 11180. Preceding OpId from leader: term: 7 index: 11184. (index mismatch)
I20251024 08:16:36.367226 22236 consensus_queue.cc:1048] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [LEADER]: Connected to new peer: Peer: permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 11183, Last known committed idx: 11180, Time since last communication: 0.000s
I20251024 08:16:36.367416 21561 raft_consensus.cc:1275] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 7 FOLLOWER]: Refusing update from remote peer e9ac8f0e11a34e5fb1c19a793f211a56: Log matching property violated. Preceding OpId in replica: term: 5 index: 11182. Preceding OpId from leader: term: 7 index: 11184. (index mismatch)
I20251024 08:16:36.367614 22236 consensus_queue.cc:1048] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [LEADER]: Connected to new peer: Peer: permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 11183, Last known committed idx: 11182, Time since last communication: 0.000s
I20251024 08:16:36.448818 21941 tablet_service.cc:1460] Tablet server e9ac8f0e11a34e5fb1c19a793f211a56 set to quiescing
I20251024 08:16:36.448877 21941 tablet_service.cc:1467] Tablet server has 1 leaders and 3 scanners
I20251024 08:16:36.449807 22241 raft_consensus.cc:993] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: : Instructing follower 97d4708eb2b64571b34044be6da3d298 to start an election
I20251024 08:16:36.449878 22241 raft_consensus.cc:1081] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 7 LEADER]: Signalling peer 97d4708eb2b64571b34044be6da3d298 to start an election
I20251024 08:16:36.450078 21696 tablet_service.cc:2038] Received Run Leader Election RPC: tablet_id: "29f8dacedea249e9883a604ac785b905"
dest_uuid: "97d4708eb2b64571b34044be6da3d298"
 from {username='slave'} at 127.18.80.67:58773
I20251024 08:16:36.450165 21696 raft_consensus.cc:493] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 7 FOLLOWER]: Starting forced leader election (received explicit request)
I20251024 08:16:36.450198 21696 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 7 FOLLOWER]: Advancing to term 8
I20251024 08:16:36.450796 21696 raft_consensus.cc:515] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 8 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:36.451057 21696 leader_election.cc:290] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [CANDIDATE]: Term 8 election: Requested vote from peers 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66:39115), e9ac8f0e11a34e5fb1c19a793f211a56 (127.18.80.67:40981)
I20251024 08:16:36.452179 21696 raft_consensus.cc:1240] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 8 FOLLOWER]: Rejecting Update request from peer e9ac8f0e11a34e5fb1c19a793f211a56 for earlier term 7. Current term is 8. Ops: [7.11234-7.11234]
I20251024 08:16:36.452333 22236 consensus_queue.cc:1059] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 }, Status: INVALID_TERM, Last received: 7.11233, Next index: 11234, Last known committed idx: 11233, Time since last communication: 0.000s
I20251024 08:16:36.452435 22236 raft_consensus.cc:3055] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 7 LEADER]: Stepping down as leader of term 7
I20251024 08:16:36.452471 22236 raft_consensus.cc:740] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 7 LEADER]: Becoming Follower/Learner. State: Replica: e9ac8f0e11a34e5fb1c19a793f211a56, State: Running, Role: LEADER
I20251024 08:16:36.452512 22236 consensus_queue.cc:260] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 11233, Committed index: 11233, Last appended: 7.11234, Last appended by leader: 11234, Current term: 7, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:36.452600 22236 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 7 FOLLOWER]: Advancing to term 8
W20251024 08:16:36.453729 21904 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:36.454777 21961 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "97d4708eb2b64571b34044be6da3d298" candidate_term: 8 candidate_status { last_received { term: 7 index: 11233 } } ignore_live_leader: true dest_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56"
I20251024 08:16:36.454861 21961 raft_consensus.cc:2410] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 8 FOLLOWER]: Leader election vote request: Denying vote to candidate 97d4708eb2b64571b34044be6da3d298 for term 8 because replica has last-logged OpId of term: 7 index: 11234, which is greater than that of the candidate, which has last-logged OpId of term: 7 index: 11233.
W20251024 08:16:36.455049 21904 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:36.457304 21561 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "97d4708eb2b64571b34044be6da3d298" candidate_term: 8 candidate_status { last_received { term: 7 index: 11233 } } ignore_live_leader: true dest_uuid: "773ff64ed1b249db9be71c247d7cbf43"
I20251024 08:16:36.457389 21561 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 7 FOLLOWER]: Advancing to term 8
I20251024 08:16:36.458173 21561 raft_consensus.cc:2410] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 8 FOLLOWER]: Leader election vote request: Denying vote to candidate 97d4708eb2b64571b34044be6da3d298 for term 8 because replica has last-logged OpId of term: 7 index: 11234, which is greater than that of the candidate, which has last-logged OpId of term: 7 index: 11233.
I20251024 08:16:36.458314 21628 leader_election.cc:304] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [CANDIDATE]: Term 8 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 97d4708eb2b64571b34044be6da3d298; no voters: 773ff64ed1b249db9be71c247d7cbf43, e9ac8f0e11a34e5fb1c19a793f211a56
I20251024 08:16:36.458454 22239 raft_consensus.cc:2749] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 8 FOLLOWER]: Leader election lost for term 8. Reason: could not achieve majority
W20251024 08:16:36.459671 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.460381 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.466921 21638 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.466928 21656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.475052 21902 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.475199 21920 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.480654 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.483773 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.490321 21656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.493829 21656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.498092 21920 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.501942 21921 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:36.507843 21809 tablet_service.cc:1460] Tablet server 36b0ebc9a5694a778497ec8d94aba993 set to quiescing
I20251024 08:16:36.507963 21809 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251024 08:16:36.508801 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.510336 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
I20251024 08:16:36.517196 21658 tablet_service.cc:1460] Tablet server 97d4708eb2b64571b34044be6da3d298 set to quiescing
I20251024 08:16:36.517292 21658 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251024 08:16:36.519861 21656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.521466 21656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.532662 21921 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.533854 21902 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.546633 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.547284 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.562304 21656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.562304 21638 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.575417 21902 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.576490 21920 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.591112 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.593008 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
I20251024 08:16:36.601442 21875 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
W20251024 08:16:36.607638 21656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.607653 21638 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.624758 21921 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.626749 21921 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.643433 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.644294 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.661839 21638 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.662321 21638 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.682336 21902 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.683260 21902 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.702896 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.705808 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.726352 21638 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.727887 21638 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.752852 22107 raft_consensus.cc:670] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: failed to trigger leader election: Illegal state: leader elections are disabled
W20251024 08:16:36.753003 21904 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.753911 21904 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.776476 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.779505 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.798821 22236 raft_consensus.cc:670] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: failed to trigger leader election: Illegal state: leader elections are disabled
W20251024 08:16:36.802096 21638 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.804658 21638 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.828722 21902 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.832863 21904 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.856295 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.860488 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.874080 22239 raft_consensus.cc:670] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: failed to trigger leader election: Illegal state: leader elections are disabled
W20251024 08:16:36.883087 21638 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.888674 21638 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.914939 21902 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.918902 21902 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.945564 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.947492 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.980067 21638 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:36.980067 21656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.013339 21903 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.013339 21920 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.044997 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.045864 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.078502 21656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.082070 21656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.113353 21920 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.117535 21904 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.151012 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.153996 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.190598 21656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.192122 21656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.228542 21920 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.232470 21904 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.269239 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.271075 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.310775 21656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.311199 21656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.352206 21920 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.353247 21903 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.393901 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.396804 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.438647 21656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.440177 21656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.483508 21903 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.486508 21903 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:37.506199 21541 tablet_service.cc:1460] Tablet server 773ff64ed1b249db9be71c247d7cbf43 set to quiescing
I20251024 08:16:37.506258 21541 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251024 08:16:37.530778 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.531930 21519 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35186: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.578785 21656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.581398 21656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
I20251024 08:16:37.599998 21941 tablet_service.cc:1460] Tablet server e9ac8f0e11a34e5fb1c19a793f211a56 set to quiescing
I20251024 08:16:37.600064 21941 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251024 08:16:37.626487 21903 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.629642 21903 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:37.655822 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 21477
I20251024 08:16:37.664727 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.18.80.66:39115
--local_ip_for_outbound_sockets=127.18.80.66
--tserver_master_addrs=127.18.80.126:45025
--webserver_port=35129
--webserver_interface=127.18.80.66
--builtin_ntp_servers=127.18.80.84:38683
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251024 08:16:37.674619 20512 meta_cache.cc:302] tablet 29f8dacedea249e9883a604ac785b905: replica 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66:39115) has failed: Network error: Client connection negotiation failed: client connection to 127.18.80.66:39115: connect: Connection refused (error 111)
W20251024 08:16:37.679641 21656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.725617 21656 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55918: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.731952 21903 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.741267 22296 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:37.741461 22296 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:37.741483 22296 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:37.742859 22296 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:37.742913 22296 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.18.80.66
I20251024 08:16:37.744500 22296 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:38683
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.18.80.66:39115
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.18.80.66
--webserver_port=35129
--tserver_master_addrs=127.18.80.126:45025
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.22296
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.18.80.66
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:37.744702 22296 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:37.744966 22296 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:37.747859 22303 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:37.747870 22305 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:37.747906 22302 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:37.747886 22296 server_base.cc:1047] running on GCE node
I20251024 08:16:37.748242 22296 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:37.748463 22296 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:37.749588 22296 hybrid_clock.cc:648] HybridClock initialized: now 1761293797749574 us; error 34 us; skew 500 ppm
I20251024 08:16:37.750780 22296 webserver.cc:492] Webserver started at http://127.18.80.66:35129/ using document root <none> and password file <none>
I20251024 08:16:37.751008 22296 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:37.751050 22296 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:37.752312 22296 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:37.753077 22311 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:37.753242 22296 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20251024 08:16:37.753304 22296 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal
uuid: "773ff64ed1b249db9be71c247d7cbf43"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:37.753552 22296 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251024 08:16:37.765105 22296 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:37.765347 22296 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:37.765431 22296 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:37.765594 22296 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251024 08:16:37.766026 22318 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251024 08:16:37.766914 22296 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251024 08:16:37.766956 22296 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:37.766981 22296 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251024 08:16:37.767514 22296 ts_tablet_manager.cc:616] Registered 1 tablets
I20251024 08:16:37.767546 22296 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:37.767608 22318 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Bootstrap starting.
I20251024 08:16:37.775483 22296 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.66:39115
I20251024 08:16:37.775545 22425 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.66:39115 every 8 connection(s)
I20251024 08:16:37.775888 22296 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data/info.pb
I20251024 08:16:37.779331 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 22296
I20251024 08:16:37.779428 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 21611
I20251024 08:16:37.782867 22426 heartbeater.cc:344] Connected to a master server at 127.18.80.126:45025
I20251024 08:16:37.783006 22426 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:37.783252 22426 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:37.783828 19911 ts_manager.cc:194] Re-registered known tserver with Master: 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66:39115)
I20251024 08:16:37.784381 19911 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.18.80.66:56511
W20251024 08:16:37.789137 20512 connection.cc:537] client connection to 127.18.80.65:35069 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
I20251024 08:16:37.789686 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.18.80.65:35069
--local_ip_for_outbound_sockets=127.18.80.65
--tserver_master_addrs=127.18.80.126:45025
--webserver_port=40139
--webserver_interface=127.18.80.65
--builtin_ntp_servers=127.18.80.84:38683
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251024 08:16:37.825178 22318 log.cc:826] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Log is configured to *not* fsync() on all Append() calls
W20251024 08:16:37.826654 21903 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:37.882740 22430 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:37.882900 22430 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:37.882920 22430 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:37.884308 22430 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:37.884368 22430 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.18.80.65
I20251024 08:16:37.885879 22430 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:38683
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.18.80.65:35069
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.18.80.65
--webserver_port=40139
--tserver_master_addrs=127.18.80.126:45025
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.22430
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.18.80.65
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:37.886125 22430 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:37.886353 22430 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:37.888828 22438 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:37.888922 22430 server_base.cc:1047] running on GCE node
W20251024 08:16:37.888824 22437 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:37.889063 22440 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:37.889319 22430 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:37.889524 22430 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:37.890662 22430 hybrid_clock.cc:648] HybridClock initialized: now 1761293797890657 us; error 37 us; skew 500 ppm
I20251024 08:16:37.891889 22430 webserver.cc:492] Webserver started at http://127.18.80.65:40139/ using document root <none> and password file <none>
I20251024 08:16:37.892091 22430 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:37.892138 22430 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:37.893350 22430 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:37.894035 22446 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:37.894272 22430 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:37.894364 22430 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal
uuid: "97d4708eb2b64571b34044be6da3d298"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:37.894711 22430 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251024 08:16:37.895920 20542 meta_cache.cc:1510] marking tablet server 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069) as failed
W20251024 08:16:37.898111 21903 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:37.898475 20543 meta_cache.cc:1510] marking tablet server 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069) as failed
W20251024 08:16:37.933846 21903 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:37.942416 22430 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:37.942809 22430 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:37.942965 22430 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:37.943248 22430 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251024 08:16:37.943799 22453 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251024 08:16:37.944825 22430 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251024 08:16:37.944917 22430 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:37.944968 22430 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251024 08:16:37.945719 22430 ts_tablet_manager.cc:616] Registered 1 tablets
I20251024 08:16:37.945770 22430 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:37.945825 22453 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Bootstrap starting.
I20251024 08:16:37.953586 22430 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.65:35069
I20251024 08:16:37.954015 22430 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data/info.pb
I20251024 08:16:37.956060 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 22430
I20251024 08:16:37.956161 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 21745
I20251024 08:16:37.961153 22560 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.65:35069 every 8 connection(s)
I20251024 08:16:37.963603 22561 heartbeater.cc:344] Connected to a master server at 127.18.80.126:45025
I20251024 08:16:37.963798 22561 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:37.964089 22561 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:37.964352 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.18.80.68:34051
--local_ip_for_outbound_sockets=127.18.80.68
--tserver_master_addrs=127.18.80.126:45025
--webserver_port=33347
--webserver_interface=127.18.80.68
--builtin_ntp_servers=127.18.80.84:38683
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251024 08:16:37.964701 19911 ts_manager.cc:194] Re-registered known tserver with Master: 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069)
I20251024 08:16:37.965269 19911 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.18.80.65:53193
I20251024 08:16:38.007423 22453 log.cc:826] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Log is configured to *not* fsync() on all Append() calls
W20251024 08:16:38.071233 21903 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:38.083669 22564 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:38.083935 22564 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:38.083971 22564 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:38.086289 22564 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:38.086383 22564 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.18.80.68
I20251024 08:16:38.089066 22564 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:38683
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.18.80.68:34051
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.18.80.68
--webserver_port=33347
--tserver_master_addrs=127.18.80.126:45025
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.22564
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.18.80.68
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:38.089349 22564 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:38.089627 22564 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:38.092738 22572 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:38.092691 22574 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:38.093252 22571 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:38.093782 22564 server_base.cc:1047] running on GCE node
I20251024 08:16:38.093977 22564 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:38.094233 22564 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:38.095407 22564 hybrid_clock.cc:648] HybridClock initialized: now 1761293798095390 us; error 34 us; skew 500 ppm
I20251024 08:16:38.097007 22564 webserver.cc:492] Webserver started at http://127.18.80.68:33347/ using document root <none> and password file <none>
I20251024 08:16:38.097256 22564 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:38.097321 22564 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:38.098909 22564 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:38.099788 22580 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
W20251024 08:16:38.100286 21903 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:56600: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:38.100562 22564 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:38.100641 22564 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal
uuid: "36b0ebc9a5694a778497ec8d94aba993"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:38.101004 22564 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251024 08:16:38.126434 22564 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:38.126772 22564 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:38.126901 22564 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:38.127130 22564 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251024 08:16:38.127544 22564 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251024 08:16:38.127621 22564 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:38.127682 22564 ts_tablet_manager.cc:616] Registered 0 tablets
I20251024 08:16:38.127724 22564 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:38.135648 22564 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.68:34051
I20251024 08:16:38.135699 22693 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.68:34051 every 8 connection(s)
I20251024 08:16:38.136214 22564 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data/info.pb
I20251024 08:16:38.141433 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 22564
I20251024 08:16:38.141532 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 21878
I20251024 08:16:38.143368 22694 heartbeater.cc:344] Connected to a master server at 127.18.80.126:45025
I20251024 08:16:38.143460 22694 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:38.143663 22694 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:38.144071 19911 ts_manager.cc:194] Re-registered known tserver with Master: 36b0ebc9a5694a778497ec8d94aba993 (127.18.80.68:34051)
I20251024 08:16:38.144526 19911 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.18.80.68:40629
I20251024 08:16:38.152668 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.18.80.67:40981
--local_ip_for_outbound_sockets=127.18.80.67
--tserver_master_addrs=127.18.80.126:45025
--webserver_port=43985
--webserver_interface=127.18.80.67
--builtin_ntp_servers=127.18.80.84:38683
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251024 08:16:38.269834 22697 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:38.270079 22697 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:38.270115 22697 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:38.272540 22697 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:38.272639 22697 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.18.80.67
I20251024 08:16:38.275223 22697 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:38683
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.18.80.67:40981
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.18.80.67
--webserver_port=43985
--tserver_master_addrs=127.18.80.126:45025
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.22697
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.18.80.67
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:38.275521 22697 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:38.275887 22697 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:38.279039 22705 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:38.279150 22703 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:38.279062 22702 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:38.280288 22697 server_base.cc:1047] running on GCE node
I20251024 08:16:38.280484 22697 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:38.280755 22697 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:38.281930 22697 hybrid_clock.cc:648] HybridClock initialized: now 1761293798281911 us; error 33 us; skew 500 ppm
I20251024 08:16:38.284870 22697 webserver.cc:492] Webserver started at http://127.18.80.67:43985/ using document root <none> and password file <none>
I20251024 08:16:38.285178 22697 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:38.285248 22697 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:38.287000 22697 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.000s	sys 0.001s
I20251024 08:16:38.287979 22711 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:38.288147 22697 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.000s	sys 0.001s
I20251024 08:16:38.288215 22697 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal
uuid: "e9ac8f0e11a34e5fb1c19a793f211a56"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:38.288555 22697 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251024 08:16:38.318073 22697 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:38.318396 22697 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:38.318535 22697 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:38.318810 22697 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251024 08:16:38.319406 22718 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251024 08:16:38.320627 22697 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251024 08:16:38.320688 22697 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:38.320724 22697 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251024 08:16:38.321563 22697 ts_tablet_manager.cc:616] Registered 1 tablets
I20251024 08:16:38.321642 22697 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:38.322576 22718 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Bootstrap starting.
I20251024 08:16:38.328536 22697 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.67:40981
I20251024 08:16:38.329054 22697 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data/info.pb
I20251024 08:16:38.330669 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 22697
I20251024 08:16:38.341383 22825 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.67:40981 every 8 connection(s)
I20251024 08:16:38.357147 22826 heartbeater.cc:344] Connected to a master server at 127.18.80.126:45025
I20251024 08:16:38.357270 22826 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:38.357518 22826 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:38.358210 19911 ts_manager.cc:194] Re-registered known tserver with Master: e9ac8f0e11a34e5fb1c19a793f211a56 (127.18.80.67:40981)
I20251024 08:16:38.358721 19911 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.18.80.67:53431
I20251024 08:16:38.398406 22718 log.cc:826] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Log is configured to *not* fsync() on all Append() calls
I20251024 08:16:38.511459 22492 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:38.516490 22756 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:38.519641 22360 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:38.532146 22628 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:38.785389 22426 heartbeater.cc:499] Master 127.18.80.126:45025 was elected leader, sending a full tablet report...
I20251024 08:16:38.878980 22318 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Bootstrap replayed 1/3 log segments. Stats: ops{read=4622 overwritten=0 applied=4619 ignored=0} inserts{seen=230850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251024 08:16:38.966223 22561 heartbeater.cc:499] Master 127.18.80.126:45025 was elected leader, sending a full tablet report...
I20251024 08:16:39.072467 22453 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Bootstrap replayed 1/3 log segments. Stats: ops{read=4622 overwritten=0 applied=4619 ignored=0} inserts{seen=230850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251024 08:16:39.145493 22694 heartbeater.cc:499] Master 127.18.80.126:45025 was elected leader, sending a full tablet report...
I20251024 08:16:39.259008 22718 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Bootstrap replayed 1/3 log segments. Stats: ops{read=4622 overwritten=0 applied=4619 ignored=0} inserts{seen=230850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251024 08:16:39.359690 22826 heartbeater.cc:499] Master 127.18.80.126:45025 was elected leader, sending a full tablet report...
I20251024 08:16:40.065784 22718 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Bootstrap replayed 2/3 log segments. Stats: ops{read=9385 overwritten=0 applied=9383 ignored=0} inserts{seen=469000 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251024 08:16:40.175345 22318 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Bootstrap replayed 2/3 log segments. Stats: ops{read=9243 overwritten=0 applied=9242 ignored=0} inserts{seen=461950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20251024 08:16:40.377732 22718 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Bootstrap replayed 3/3 log segments. Stats: ops{read=11234 overwritten=0 applied=11233 ignored=0} inserts{seen=561450 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20251024 08:16:40.378189 22718 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Bootstrap complete.
I20251024 08:16:40.382470 22718 ts_tablet_manager.cc:1403] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Time spent bootstrapping tablet: real 2.061s	user 1.711s	sys 0.320s
I20251024 08:16:40.383574 22718 raft_consensus.cc:359] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 8 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:40.384186 22718 raft_consensus.cc:740] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 8 FOLLOWER]: Becoming Follower/Learner. State: Replica: e9ac8f0e11a34e5fb1c19a793f211a56, State: Initialized, Role: FOLLOWER
I20251024 08:16:40.384385 22718 consensus_queue.cc:260] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 11233, Last appended: 7.11234, Last appended by leader: 11234, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:40.384591 22718 ts_tablet_manager.cc:1434] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Time spent starting tablet: real 0.002s	user 0.003s	sys 0.001s
I20251024 08:16:40.403923 22453 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Bootstrap replayed 2/3 log segments. Stats: ops{read=9374 overwritten=0 applied=9373 ignored=0} inserts{seen=468500 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
W20251024 08:16:40.579902 22740 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:40.609078 22318 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Bootstrap replayed 3/3 log segments. Stats: ops{read=11234 overwritten=0 applied=11233 ignored=0} inserts{seen=561450 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20251024 08:16:40.609505 22318 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Bootstrap complete.
I20251024 08:16:40.614082 22318 ts_tablet_manager.cc:1403] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Time spent bootstrapping tablet: real 2.847s	user 2.433s	sys 0.395s
I20251024 08:16:40.614980 22318 raft_consensus.cc:359] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 8 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:40.615525 22318 raft_consensus.cc:740] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 8 FOLLOWER]: Becoming Follower/Learner. State: Replica: 773ff64ed1b249db9be71c247d7cbf43, State: Initialized, Role: FOLLOWER
I20251024 08:16:40.615646 22318 consensus_queue.cc:260] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 11233, Last appended: 7.11234, Last appended by leader: 11234, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:40.615880 22318 ts_tablet_manager.cc:1434] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Time spent starting tablet: real 0.002s	user 0.003s	sys 0.000s
W20251024 08:16:40.639189 22740 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:40.669276 22340 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
I20251024 08:16:40.724176 22453 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Bootstrap replayed 3/3 log segments. Stats: ops{read=11233 overwritten=0 applied=11233 ignored=0} inserts{seen=561450 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251024 08:16:40.724633 22453 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Bootstrap complete.
I20251024 08:16:40.728683 22453 ts_tablet_manager.cc:1403] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Time spent bootstrapping tablet: real 2.783s	user 2.312s	sys 0.431s
I20251024 08:16:40.729573 22453 raft_consensus.cc:359] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 8 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:40.729794 22453 raft_consensus.cc:740] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 8 FOLLOWER]: Becoming Follower/Learner. State: Replica: 97d4708eb2b64571b34044be6da3d298, State: Initialized, Role: FOLLOWER
I20251024 08:16:40.729939 22453 consensus_queue.cc:260] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 11233, Last appended: 7.11233, Last appended by leader: 11233, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:40.730187 22453 ts_tablet_manager.cc:1434] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Time spent starting tablet: real 0.001s	user 0.005s	sys 0.000s
W20251024 08:16:40.730297 22340 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
I20251024 08:16:40.748572 22870 raft_consensus.cc:493] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 8 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251024 08:16:40.748746 22870 raft_consensus.cc:515] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 8 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:40.749100 22870 leader_election.cc:290] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [CANDIDATE]: Term 9 pre-election: Requested pre-vote from peers 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66:39115), 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069)
I20251024 08:16:40.752959 22380 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" candidate_term: 9 candidate_status { last_received { term: 7 index: 11234 } } ignore_live_leader: false dest_uuid: "773ff64ed1b249db9be71c247d7cbf43" is_pre_election: true
I20251024 08:16:40.753003 22515 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" candidate_term: 9 candidate_status { last_received { term: 7 index: 11234 } } ignore_live_leader: false dest_uuid: "97d4708eb2b64571b34044be6da3d298" is_pre_election: true
I20251024 08:16:40.753131 22515 raft_consensus.cc:2468] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 8 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate e9ac8f0e11a34e5fb1c19a793f211a56 in term 8.
I20251024 08:16:40.753131 22380 raft_consensus.cc:2468] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 8 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate e9ac8f0e11a34e5fb1c19a793f211a56 in term 8.
I20251024 08:16:40.753325 22712 leader_election.cc:304] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [CANDIDATE]: Term 9 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 773ff64ed1b249db9be71c247d7cbf43, e9ac8f0e11a34e5fb1c19a793f211a56; no voters: 
I20251024 08:16:40.753436 22870 raft_consensus.cc:2804] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 8 FOLLOWER]: Leader pre-election won for term 9
I20251024 08:16:40.753500 22870 raft_consensus.cc:493] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 8 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251024 08:16:40.753525 22870 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 8 FOLLOWER]: Advancing to term 9
I20251024 08:16:40.754417 22870 raft_consensus.cc:515] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 9 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:40.754527 22870 leader_election.cc:290] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [CANDIDATE]: Term 9 election: Requested vote from peers 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66:39115), 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069)
I20251024 08:16:40.754748 22380 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" candidate_term: 9 candidate_status { last_received { term: 7 index: 11234 } } ignore_live_leader: false dest_uuid: "773ff64ed1b249db9be71c247d7cbf43"
I20251024 08:16:40.754748 22515 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" candidate_term: 9 candidate_status { last_received { term: 7 index: 11234 } } ignore_live_leader: false dest_uuid: "97d4708eb2b64571b34044be6da3d298"
I20251024 08:16:40.754825 22380 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 8 FOLLOWER]: Advancing to term 9
I20251024 08:16:40.754825 22515 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 8 FOLLOWER]: Advancing to term 9
I20251024 08:16:40.756003 22515 raft_consensus.cc:2468] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 9 FOLLOWER]: Leader election vote request: Granting yes vote for candidate e9ac8f0e11a34e5fb1c19a793f211a56 in term 9.
I20251024 08:16:40.756002 22380 raft_consensus.cc:2468] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 9 FOLLOWER]: Leader election vote request: Granting yes vote for candidate e9ac8f0e11a34e5fb1c19a793f211a56 in term 9.
I20251024 08:16:40.756168 22712 leader_election.cc:304] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [CANDIDATE]: Term 9 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 97d4708eb2b64571b34044be6da3d298, e9ac8f0e11a34e5fb1c19a793f211a56; no voters: 
I20251024 08:16:40.756264 22870 raft_consensus.cc:2804] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 9 FOLLOWER]: Leader election won for term 9
I20251024 08:16:40.756397 22870 raft_consensus.cc:697] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 9 LEADER]: Becoming Leader. State: Replica: e9ac8f0e11a34e5fb1c19a793f211a56, State: Running, Role: LEADER
I20251024 08:16:40.756482 22870 consensus_queue.cc:237] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 11233, Committed index: 11233, Last appended: 7.11234, Last appended by leader: 11234, Current term: 9, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:40.757180 19911 catalog_manager.cc:5649] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 reported cstate change: term changed from 7 to 9. New cstate: current_term: 9 leader_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } health_report { overall_health: UNKNOWN } } }
W20251024 08:16:40.762192 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:40.821288 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
I20251024 08:16:40.844677 22515 raft_consensus.cc:1275] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 9 FOLLOWER]: Refusing update from remote peer e9ac8f0e11a34e5fb1c19a793f211a56: Log matching property violated. Preceding OpId in replica: term: 7 index: 11233. Preceding OpId from leader: term: 9 index: 11235. (index mismatch)
I20251024 08:16:40.845057 22870 consensus_queue.cc:1048] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [LEADER]: Connected to new peer: Peer: permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 11235, Last known committed idx: 11233, Time since last communication: 0.000s
I20251024 08:16:40.846884 22380 raft_consensus.cc:1275] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 9 FOLLOWER]: Refusing update from remote peer e9ac8f0e11a34e5fb1c19a793f211a56: Log matching property violated. Preceding OpId in replica: term: 7 index: 11234. Preceding OpId from leader: term: 9 index: 11235. (index mismatch)
I20251024 08:16:40.847440 22878 consensus_queue.cc:1048] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [LEADER]: Connected to new peer: Peer: permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 11235, Last known committed idx: 11233, Time since last communication: 0.000s
W20251024 08:16:40.847792 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:40.849243 22340 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:41.500329 20542 scanner-internal.cc:458] Time spent opening tablet: real 4.007s	user 0.001s	sys 0.000s
W20251024 08:16:41.501822 20543 scanner-internal.cc:458] Time spent opening tablet: real 4.005s	user 0.001s	sys 0.000s
W20251024 08:16:41.617588 20544 scanner-internal.cc:458] Time spent opening tablet: real 4.012s	user 0.001s	sys 0.000s
I20251024 08:16:43.768792 22360 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:43.770197 22492 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:43.779569 22628 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:43.787132 22756 tablet_service.cc:1467] Tablet server has 1 leaders and 3 scanners
I20251024 08:16:44.156134 19914 ts_manager.cc:284] Unset tserver state for 36b0ebc9a5694a778497ec8d94aba993 from MAINTENANCE_MODE
I20251024 08:16:44.156922 22694 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:44.165602 19915 ts_manager.cc:284] Unset tserver state for 773ff64ed1b249db9be71c247d7cbf43 from MAINTENANCE_MODE
I20251024 08:16:44.205154 19915 ts_manager.cc:284] Unset tserver state for 97d4708eb2b64571b34044be6da3d298 from MAINTENANCE_MODE
I20251024 08:16:44.209008 19915 ts_manager.cc:284] Unset tserver state for e9ac8f0e11a34e5fb1c19a793f211a56 from MAINTENANCE_MODE
I20251024 08:16:44.488008 19915 ts_manager.cc:295] Set tserver state for e9ac8f0e11a34e5fb1c19a793f211a56 to MAINTENANCE_MODE
I20251024 08:16:44.490581 19915 ts_manager.cc:295] Set tserver state for 773ff64ed1b249db9be71c247d7cbf43 to MAINTENANCE_MODE
I20251024 08:16:44.583679 19915 ts_manager.cc:295] Set tserver state for 97d4708eb2b64571b34044be6da3d298 to MAINTENANCE_MODE
I20251024 08:16:44.688220 19915 ts_manager.cc:295] Set tserver state for 36b0ebc9a5694a778497ec8d94aba993 to MAINTENANCE_MODE
I20251024 08:16:44.744900 22360 tablet_service.cc:1460] Tablet server 773ff64ed1b249db9be71c247d7cbf43 set to quiescing
I20251024 08:16:44.744977 22360 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:44.834545 22756 tablet_service.cc:1460] Tablet server e9ac8f0e11a34e5fb1c19a793f211a56 set to quiescing
I20251024 08:16:44.834611 22756 tablet_service.cc:1467] Tablet server has 1 leaders and 3 scanners
I20251024 08:16:44.836138 22898 raft_consensus.cc:993] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: : Instructing follower 97d4708eb2b64571b34044be6da3d298 to start an election
I20251024 08:16:44.836194 22898 raft_consensus.cc:1081] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 9 LEADER]: Signalling peer 97d4708eb2b64571b34044be6da3d298 to start an election
I20251024 08:16:44.836690 22515 tablet_service.cc:2038] Received Run Leader Election RPC: tablet_id: "29f8dacedea249e9883a604ac785b905"
dest_uuid: "97d4708eb2b64571b34044be6da3d298"
 from {username='slave'} at 127.18.80.67:51325
I20251024 08:16:44.836784 22515 raft_consensus.cc:493] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 9 FOLLOWER]: Starting forced leader election (received explicit request)
I20251024 08:16:44.836817 22515 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 9 FOLLOWER]: Advancing to term 10
I20251024 08:16:44.837632 22515 raft_consensus.cc:515] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 10 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:44.837878 22515 leader_election.cc:290] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [CANDIDATE]: Term 10 election: Requested vote from peers 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66:39115), e9ac8f0e11a34e5fb1c19a793f211a56 (127.18.80.67:40981)
I20251024 08:16:44.838760 22514 raft_consensus.cc:1240] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 10 FOLLOWER]: Rejecting Update request from peer e9ac8f0e11a34e5fb1c19a793f211a56 for earlier term 9. Current term is 10. Ops: [9.14898-9.14898]
I20251024 08:16:44.839190 22947 consensus_queue.cc:1059] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 }, Status: INVALID_TERM, Last received: 9.14897, Next index: 14898, Last known committed idx: 14897, Time since last communication: 0.000s
I20251024 08:16:44.839296 22947 raft_consensus.cc:3055] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 9 LEADER]: Stepping down as leader of term 9
I20251024 08:16:44.839323 22947 raft_consensus.cc:740] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 9 LEADER]: Becoming Follower/Learner. State: Replica: e9ac8f0e11a34e5fb1c19a793f211a56, State: Running, Role: LEADER
I20251024 08:16:44.839370 22947 consensus_queue.cc:260] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 14899, Committed index: 14899, Last appended: 9.14899, Last appended by leader: 14899, Current term: 9, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:44.839452 22947 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 9 FOLLOWER]: Advancing to term 10
W20251024 08:16:44.840813 22740 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.840960 22740 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.841984 22740 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.843861 22340 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.844048 22339 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.844157 22338 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
I20251024 08:16:44.846117 22780 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "97d4708eb2b64571b34044be6da3d298" candidate_term: 10 candidate_status { last_received { term: 9 index: 14897 } } ignore_live_leader: true dest_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56"
I20251024 08:16:44.846244 22780 raft_consensus.cc:2410] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 10 FOLLOWER]: Leader election vote request: Denying vote to candidate 97d4708eb2b64571b34044be6da3d298 for term 10 because replica has last-logged OpId of term: 9 index: 14899, which is greater than that of the candidate, which has last-logged OpId of term: 9 index: 14897.
W20251024 08:16:44.846855 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
I20251024 08:16:44.847496 22380 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "97d4708eb2b64571b34044be6da3d298" candidate_term: 10 candidate_status { last_received { term: 9 index: 14897 } } ignore_live_leader: true dest_uuid: "773ff64ed1b249db9be71c247d7cbf43"
I20251024 08:16:44.847579 22380 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 9 FOLLOWER]: Advancing to term 10
I20251024 08:16:44.848383 22380 raft_consensus.cc:2410] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 10 FOLLOWER]: Leader election vote request: Denying vote to candidate 97d4708eb2b64571b34044be6da3d298 for term 10 because replica has last-logged OpId of term: 9 index: 14899, which is greater than that of the candidate, which has last-logged OpId of term: 9 index: 14897.
I20251024 08:16:44.848642 22447 leader_election.cc:304] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [CANDIDATE]: Term 10 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 97d4708eb2b64571b34044be6da3d298; no voters: 773ff64ed1b249db9be71c247d7cbf43, e9ac8f0e11a34e5fb1c19a793f211a56
I20251024 08:16:44.848965 23065 raft_consensus.cc:2749] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 10 FOLLOWER]: Leader election lost for term 10. Reason: could not achieve majority
W20251024 08:16:44.850585 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.851213 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.854064 22739 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:44.855111 22426 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:44.855108 22561 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
W20251024 08:16:44.856114 22739 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.856269 22740 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.858695 22338 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
I20251024 08:16:44.860607 22826 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
W20251024 08:16:44.863826 22338 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
I20251024 08:16:44.864351 22492 tablet_service.cc:1460] Tablet server 97d4708eb2b64571b34044be6da3d298 set to quiescing
W20251024 08:16:44.864413 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
I20251024 08:16:44.864418 22492 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251024 08:16:44.864854 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.869449 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.875000 22338 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.875689 22740 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.881613 22740 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.884204 22338 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.884268 22739 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.892828 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.893361 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.893806 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.903997 22740 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.904489 22338 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.906013 22338 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.917302 22338 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.917788 22739 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.919847 22739 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.930495 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.931382 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.931411 22457 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.943099 22338 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.946719 22338 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.947793 22739 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.958158 22740 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.960130 22736 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:44.961359 22628 tablet_service.cc:1460] Tablet server 36b0ebc9a5694a778497ec8d94aba993 set to quiescing
I20251024 08:16:44.961422 22628 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251024 08:16:44.964395 22338 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.973762 22338 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.975842 22338 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.979418 22457 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.993043 22457 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.994534 22457 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:44.996421 22739 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.011683 22736 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.015112 22338 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.016095 22739 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.031334 22338 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.036674 22338 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.036689 22457 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.050316 22457 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.056454 22736 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.058992 22457 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.073108 22740 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.078095 22338 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.080049 22735 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.096582 22338 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.101496 22338 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.103101 22457 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.120762 22457 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.124368 22457 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.127357 22739 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.141618 23083 raft_consensus.cc:670] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: failed to trigger leader election: Illegal state: leader elections are disabled
W20251024 08:16:45.147490 22740 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.149509 22735 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.153959 22338 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
I20251024 08:16:45.157877 22694 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
W20251024 08:16:45.163756 23065 raft_consensus.cc:670] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: failed to trigger leader election: Illegal state: leader elections are disabled
W20251024 08:16:45.172081 22338 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.172590 22947 raft_consensus.cc:670] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: failed to trigger leader election: Illegal state: leader elections are disabled
W20251024 08:16:45.173945 22338 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.182529 22457 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.198155 22457 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.198362 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.209362 22739 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.223434 22736 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.224421 22735 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.237977 22338 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.250038 22339 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.250039 22338 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.266685 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.277253 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.279739 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.296054 22736 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.305997 22735 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.312227 22739 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.327728 22339 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.335680 22339 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.340867 22339 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.358469 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.365984 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.374564 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.391569 22736 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.399706 22736 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.407946 22736 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.426124 22339 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.433324 22339 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.443634 22339 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.461447 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.468216 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.479079 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.496079 22736 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.506315 22736 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.514575 22736 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.531375 22339 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.543788 22339 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.550977 22339 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.570453 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.580322 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.591151 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.612684 22736 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.621914 22736 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.629299 22736 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.653364 22339 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.660544 22339 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.669821 22339 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.696676 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.699411 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.712324 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.741844 22736 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.741844 22735 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.754002 22735 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.783962 22339 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.785835 22339 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.798153 22339 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.828230 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.830833 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.842482 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.874012 22735 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.874737 22735 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.890184 22735 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.918833 22339 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.921955 22339 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.934259 22339 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47918: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.965668 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.969449 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:45.981120 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
I20251024 08:16:45.981467 22756 tablet_service.cc:1460] Tablet server e9ac8f0e11a34e5fb1c19a793f211a56 set to quiescing
I20251024 08:16:45.981521 22756 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251024 08:16:46.013514 22735 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:46.019586 22735 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:46.031733 22735 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:46.037071 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 22296
I20251024 08:16:46.047569 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.18.80.66:39115
--local_ip_for_outbound_sockets=127.18.80.66
--tserver_master_addrs=127.18.80.126:45025
--webserver_port=35129
--webserver_interface=127.18.80.66
--builtin_ntp_servers=127.18.80.84:38683
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251024 08:16:46.061059 20512 meta_cache.cc:302] tablet 29f8dacedea249e9883a604ac785b905: replica 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66:39115) has failed: Network error: Client connection negotiation failed: client connection to 127.18.80.66:39115: connect: Connection refused (error 111)
W20251024 08:16:46.069020 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:46.082062 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:46.111003 22475 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57304: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:46.118405 22735 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:46.125562 23095 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:46.125722 23095 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:46.125741 23095 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:46.127118 23095 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:46.127174 23095 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.18.80.66
I20251024 08:16:46.128664 23095 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:38683
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.18.80.66:39115
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.18.80.66
--webserver_port=35129
--tserver_master_addrs=127.18.80.126:45025
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.23095
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.18.80.66
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:46.128912 23095 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:46.129169 23095 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:46.131886 23102 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:46.131897 23101 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:46.131938 23095 server_base.cc:1047] running on GCE node
W20251024 08:16:46.132109 23104 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:46.132277 23095 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:46.132462 23095 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:46.133595 23095 hybrid_clock.cc:648] HybridClock initialized: now 1761293806133578 us; error 34 us; skew 500 ppm
I20251024 08:16:46.134694 23095 webserver.cc:492] Webserver started at http://127.18.80.66:35129/ using document root <none> and password file <none>
I20251024 08:16:46.134866 23095 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:46.134902 23095 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:46.135952 23095 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:46.136587 23110 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:46.136785 23095 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20251024 08:16:46.136866 23095 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal
uuid: "773ff64ed1b249db9be71c247d7cbf43"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:46.137182 23095 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251024 08:16:46.153107 23095 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:46.153358 23095 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:46.153440 23095 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:46.153626 23095 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251024 08:16:46.154050 23117 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251024 08:16:46.154881 23095 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251024 08:16:46.154927 23095 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:46.154953 23095 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251024 08:16:46.155488 23095 ts_tablet_manager.cc:616] Registered 1 tablets
I20251024 08:16:46.155517 23095 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:46.155586 23117 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Bootstrap starting.
I20251024 08:16:46.161319 23095 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.66:39115
I20251024 08:16:46.161661 23224 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.66:39115 every 8 connection(s)
I20251024 08:16:46.161803 23095 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data/info.pb
I20251024 08:16:46.161978 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 23095
W20251024 08:16:46.161975 22735 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:46.162093 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 22430
I20251024 08:16:46.170166 23225 heartbeater.cc:344] Connected to a master server at 127.18.80.126:45025
I20251024 08:16:46.170284 23225 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:46.170493 23225 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:46.170995 19914 ts_manager.cc:194] Re-registered known tserver with Master: 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66:39115)
I20251024 08:16:46.171397 19914 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.18.80.66:36123
I20251024 08:16:46.177335 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.18.80.65:35069
--local_ip_for_outbound_sockets=127.18.80.65
--tserver_master_addrs=127.18.80.126:45025
--webserver_port=40139
--webserver_interface=127.18.80.65
--builtin_ntp_servers=127.18.80.84:38683
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251024 08:16:46.190827 23117 log.cc:826] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Log is configured to *not* fsync() on all Append() calls
W20251024 08:16:46.234230 22735 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:46.260185 23230 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:46.260444 23230 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:46.260497 23230 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:46.262730 23230 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:46.262813 23230 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.18.80.65
I20251024 08:16:46.264483 23230 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:38683
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.18.80.65:35069
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.18.80.65
--webserver_port=40139
--tserver_master_addrs=127.18.80.126:45025
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.23230
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.18.80.65
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:46.264693 23230 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:46.264956 23230 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:46.267413 23237 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:46.267429 23236 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:46.267654 23230 server_base.cc:1047] running on GCE node
W20251024 08:16:46.267786 23239 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:46.267966 23230 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:46.268198 23230 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:46.269344 23230 hybrid_clock.cc:648] HybridClock initialized: now 1761293806269333 us; error 26 us; skew 500 ppm
I20251024 08:16:46.270495 23230 webserver.cc:492] Webserver started at http://127.18.80.65:40139/ using document root <none> and password file <none>
I20251024 08:16:46.270727 23230 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:46.270768 23230 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:46.271962 23230 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:46.272665 23245 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:46.272907 23230 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20251024 08:16:46.272997 23230 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal
uuid: "97d4708eb2b64571b34044be6da3d298"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:46.273348 23230 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20251024 08:16:46.285597 22735 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:46.289144 23230 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:46.289428 23230 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:46.289573 23230 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:46.289813 23230 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251024 08:16:46.290323 23252 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251024 08:16:46.291244 23230 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251024 08:16:46.291308 23230 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:46.291344 23230 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251024 08:16:46.292114 23230 ts_tablet_manager.cc:616] Registered 1 tablets
I20251024 08:16:46.292160 23230 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:46.292219 23252 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Bootstrap starting.
I20251024 08:16:46.299402 23230 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.65:35069
I20251024 08:16:46.299527 23359 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.65:35069 every 8 connection(s)
I20251024 08:16:46.299990 23230 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data/info.pb
I20251024 08:16:46.304061 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 23230
I20251024 08:16:46.304175 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 22564
I20251024 08:16:46.311473 23360 heartbeater.cc:344] Connected to a master server at 127.18.80.126:45025
I20251024 08:16:46.311614 23360 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:46.311864 23360 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:46.313035 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.18.80.68:34051
--local_ip_for_outbound_sockets=127.18.80.68
--tserver_master_addrs=127.18.80.126:45025
--webserver_port=33347
--webserver_interface=127.18.80.68
--builtin_ntp_servers=127.18.80.84:38683
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251024 08:16:46.313679 19914 ts_manager.cc:194] Re-registered known tserver with Master: 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069)
I20251024 08:16:46.314186 19914 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.18.80.65:48729
I20251024 08:16:46.330130 23252 log.cc:826] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Log is configured to *not* fsync() on all Append() calls
W20251024 08:16:46.379456 22735 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:46.393792 22735 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:46.428884 23363 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:46.429148 23363 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:46.429183 23363 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:46.431497 23363 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:46.431591 23363 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.18.80.68
I20251024 08:16:46.434134 23363 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:38683
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.18.80.68:34051
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.18.80.68
--webserver_port=33347
--tserver_master_addrs=127.18.80.126:45025
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.23363
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.18.80.68
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:46.434405 23363 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:46.434705 23363 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:46.437674 23370 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:46.437671 23371 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:46.438278 23363 server_base.cc:1047] running on GCE node
W20251024 08:16:46.441017 23373 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:46.441280 23363 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:46.441550 23363 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:46.442720 23363 hybrid_clock.cc:648] HybridClock initialized: now 1761293806442695 us; error 42 us; skew 500 ppm
I20251024 08:16:46.444260 23363 webserver.cc:492] Webserver started at http://127.18.80.68:33347/ using document root <none> and password file <none>
I20251024 08:16:46.444504 23363 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:46.444566 23363 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:46.446159 23363 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:46.446889 23379 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:46.447072 23363 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:46.447150 23363 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal
uuid: "36b0ebc9a5694a778497ec8d94aba993"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:46.447465 23363 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20251024 08:16:46.448621 22735 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33700: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:46.467617 23363 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:46.467913 23363 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:46.468050 23363 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:46.468307 23363 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251024 08:16:46.468694 23363 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251024 08:16:46.468739 23363 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:46.468771 23363 ts_tablet_manager.cc:616] Registered 0 tablets
I20251024 08:16:46.468796 23363 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:46.475927 23363 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.68:34051
I20251024 08:16:46.476073 23492 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.68:34051 every 8 connection(s)
I20251024 08:16:46.476351 23363 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data/info.pb
I20251024 08:16:46.479688 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 23363
I20251024 08:16:46.479789 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 22697
I20251024 08:16:46.486905 23493 heartbeater.cc:344] Connected to a master server at 127.18.80.126:45025
I20251024 08:16:46.487130 23493 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:46.487452 23493 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:46.487917 19914 ts_manager.cc:194] Re-registered known tserver with Master: 36b0ebc9a5694a778497ec8d94aba993 (127.18.80.68:34051)
I20251024 08:16:46.488476 19914 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.18.80.68:33845
I20251024 08:16:46.497917 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.18.80.67:40981
--local_ip_for_outbound_sockets=127.18.80.67
--tserver_master_addrs=127.18.80.126:45025
--webserver_port=43985
--webserver_interface=127.18.80.67
--builtin_ntp_servers=127.18.80.84:38683
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251024 08:16:46.615305 23496 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:46.615634 23496 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:46.615713 23496 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:46.618152 23496 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:46.618314 23496 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.18.80.67
I20251024 08:16:46.620909 23496 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:38683
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.18.80.67:40981
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.18.80.67
--webserver_port=43985
--tserver_master_addrs=127.18.80.126:45025
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.23496
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.18.80.67
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:46.621212 23496 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:46.621507 23496 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:46.624413 23502 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:46.624408 23501 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:46.624511 23504 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:46.625056 23496 server_base.cc:1047] running on GCE node
I20251024 08:16:46.625236 23496 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:46.625478 23496 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:46.626645 23496 hybrid_clock.cc:648] HybridClock initialized: now 1761293806626619 us; error 39 us; skew 500 ppm
I20251024 08:16:46.628597 23496 webserver.cc:492] Webserver started at http://127.18.80.67:43985/ using document root <none> and password file <none>
I20251024 08:16:46.628858 23496 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:46.629236 23496 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:46.631065 23496 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:46.631944 23510 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:46.632128 23496 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:46.632198 23496 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal
uuid: "e9ac8f0e11a34e5fb1c19a793f211a56"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:46.632543 23496 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251024 08:16:46.655951 23496 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:46.656283 23496 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:46.656423 23496 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:46.656680 23496 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251024 08:16:46.657277 23517 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251024 08:16:46.658425 23496 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251024 08:16:46.658484 23496 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:46.658522 23496 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251024 08:16:46.659255 23496 ts_tablet_manager.cc:616] Registered 1 tablets
I20251024 08:16:46.659300 23496 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:46.659454 23517 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Bootstrap starting.
I20251024 08:16:46.666126 23496 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.67:40981
I20251024 08:16:46.666548 23496 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data/info.pb
I20251024 08:16:46.670747 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 23496
I20251024 08:16:46.677793 23624 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.67:40981 every 8 connection(s)
I20251024 08:16:46.701788 23625 heartbeater.cc:344] Connected to a master server at 127.18.80.126:45025
I20251024 08:16:46.701910 23625 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:46.702198 23625 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:46.702843 19914 ts_manager.cc:194] Re-registered known tserver with Master: e9ac8f0e11a34e5fb1c19a793f211a56 (127.18.80.67:40981)
I20251024 08:16:46.703399 19914 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.18.80.67:51313
I20251024 08:16:46.706308 23517 log.cc:826] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Log is configured to *not* fsync() on all Append() calls
I20251024 08:16:46.857826 23427 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:46.861073 23294 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:46.866147 23159 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:46.868194 23559 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:47.172174 23225 heartbeater.cc:499] Master 127.18.80.126:45025 was elected leader, sending a full tablet report...
I20251024 08:16:47.191529 23117 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Bootstrap replayed 1/4 log segments. Stats: ops{read=4622 overwritten=0 applied=4619 ignored=0} inserts{seen=230850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251024 08:16:47.315622 23360 heartbeater.cc:499] Master 127.18.80.126:45025 was elected leader, sending a full tablet report...
I20251024 08:16:47.481731 23252 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Bootstrap replayed 1/4 log segments. Stats: ops{read=4622 overwritten=0 applied=4619 ignored=0} inserts{seen=230850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251024 08:16:47.491093 23493 heartbeater.cc:499] Master 127.18.80.126:45025 was elected leader, sending a full tablet report...
I20251024 08:16:47.534842 23517 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Bootstrap replayed 1/4 log segments. Stats: ops{read=4622 overwritten=0 applied=4619 ignored=0} inserts{seen=230850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251024 08:16:47.704241 23625 heartbeater.cc:499] Master 127.18.80.126:45025 was elected leader, sending a full tablet report...
I20251024 08:16:48.326161 23517 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Bootstrap replayed 2/4 log segments. Stats: ops{read=9243 overwritten=0 applied=9241 ignored=0} inserts{seen=461900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251024 08:16:48.414094 23117 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Bootstrap replayed 2/4 log segments. Stats: ops{read=9243 overwritten=0 applied=9240 ignored=0} inserts{seen=461850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251024 08:16:48.704934 23252 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Bootstrap replayed 2/4 log segments. Stats: ops{read=9243 overwritten=0 applied=9241 ignored=0} inserts{seen=461900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251024 08:16:49.130874 23517 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Bootstrap replayed 3/4 log segments. Stats: ops{read=13867 overwritten=0 applied=13864 ignored=0} inserts{seen=692950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251024 08:16:49.323364 23517 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Bootstrap replayed 4/4 log segments. Stats: ops{read=14899 overwritten=0 applied=14899 ignored=0} inserts{seen=744700 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251024 08:16:49.323904 23517 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Bootstrap complete.
I20251024 08:16:49.329978 23517 ts_tablet_manager.cc:1403] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Time spent bootstrapping tablet: real 2.671s	user 2.246s	sys 0.387s
I20251024 08:16:49.330968 23517 raft_consensus.cc:359] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 10 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:49.331178 23517 raft_consensus.cc:740] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 10 FOLLOWER]: Becoming Follower/Learner. State: Replica: e9ac8f0e11a34e5fb1c19a793f211a56, State: Initialized, Role: FOLLOWER
I20251024 08:16:49.331291 23517 consensus_queue.cc:260] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 14899, Last appended: 9.14899, Last appended by leader: 14899, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:49.331585 23517 ts_tablet_manager.cc:1434] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
W20251024 08:16:49.354912 23538 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33792: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:49.440665 23538 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33792: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:49.518806 23538 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33792: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:49.566412 20544 scanner-internal.cc:458] Time spent opening tablet: real 4.007s	user 0.001s	sys 0.000s
I20251024 08:16:49.587978 23117 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Bootstrap replayed 3/4 log segments. Stats: ops{read=13993 overwritten=0 applied=13991 ignored=0} inserts{seen=699300 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
W20251024 08:16:49.645673 23539 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33792: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:49.657440 23670 raft_consensus.cc:493] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 10 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251024 08:16:49.657608 23670 raft_consensus.cc:515] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 10 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:49.657919 23670 leader_election.cc:290] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [CANDIDATE]: Term 11 pre-election: Requested pre-vote from peers 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66:39115), 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069)
I20251024 08:16:49.676247 23179 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" candidate_term: 11 candidate_status { last_received { term: 9 index: 14899 } } ignore_live_leader: false dest_uuid: "773ff64ed1b249db9be71c247d7cbf43" is_pre_election: true
W20251024 08:16:49.677789 23511 leader_election.cc:343] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [CANDIDATE]: Term 11 pre-election: Tablet error from VoteRequest() call to peer 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66:39115): Illegal state: must be running to vote when last-logged opid is not known
I20251024 08:16:49.682845 23314 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" candidate_term: 11 candidate_status { last_received { term: 9 index: 14899 } } ignore_live_leader: false dest_uuid: "97d4708eb2b64571b34044be6da3d298" is_pre_election: true
W20251024 08:16:49.684015 23511 leader_election.cc:343] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [CANDIDATE]: Term 11 pre-election: Tablet error from VoteRequest() call to peer 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069): Illegal state: must be running to vote when last-logged opid is not known
I20251024 08:16:49.684099 23511 leader_election.cc:304] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [CANDIDATE]: Term 11 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: e9ac8f0e11a34e5fb1c19a793f211a56; no voters: 773ff64ed1b249db9be71c247d7cbf43, 97d4708eb2b64571b34044be6da3d298
I20251024 08:16:49.684255 23670 raft_consensus.cc:2749] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 10 FOLLOWER]: Leader pre-election lost for term 11. Reason: could not achieve majority
W20251024 08:16:49.736294 23539 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33792: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:49.812331 23539 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33792: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
I20251024 08:16:49.855568 23117 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Bootstrap replayed 4/4 log segments. Stats: ops{read=14899 overwritten=0 applied=14899 ignored=0} inserts{seen=744700 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251024 08:16:49.856295 23117 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Bootstrap complete.
I20251024 08:16:49.864527 23117 ts_tablet_manager.cc:1403] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Time spent bootstrapping tablet: real 3.709s	user 3.207s	sys 0.466s
I20251024 08:16:49.865726 23117 raft_consensus.cc:359] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 10 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:49.866111 23117 raft_consensus.cc:740] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 10 FOLLOWER]: Becoming Follower/Learner. State: Replica: 773ff64ed1b249db9be71c247d7cbf43, State: Initialized, Role: FOLLOWER
I20251024 08:16:49.866261 23117 consensus_queue.cc:260] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 14899, Last appended: 9.14899, Last appended by leader: 14899, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:49.866554 23117 ts_tablet_manager.cc:1434] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
I20251024 08:16:49.930491 23252 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Bootstrap replayed 3/4 log segments. Stats: ops{read=14003 overwritten=0 applied=14001 ignored=0} inserts{seen=699800 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
W20251024 08:16:49.942291 23538 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33792: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:50.017819 23139 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47976: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:50.040930 23538 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33792: Illegal state: replica e9ac8f0e11a34e5fb1c19a793f211a56 is not leader of this config: current role FOLLOWER
W20251024 08:16:50.042833 23139 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47976: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
I20251024 08:16:50.077003 23670 raft_consensus.cc:493] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 10 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251024 08:16:50.077104 23670 raft_consensus.cc:515] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 10 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:50.077273 23670 leader_election.cc:290] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [CANDIDATE]: Term 11 pre-election: Requested pre-vote from peers 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66:39115), 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069)
I20251024 08:16:50.077486 23314 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" candidate_term: 11 candidate_status { last_received { term: 9 index: 14899 } } ignore_live_leader: false dest_uuid: "97d4708eb2b64571b34044be6da3d298" is_pre_election: true
W20251024 08:16:50.077689 23511 leader_election.cc:343] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [CANDIDATE]: Term 11 pre-election: Tablet error from VoteRequest() call to peer 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069): Illegal state: must be running to vote when last-logged opid is not known
I20251024 08:16:50.077687 23179 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" candidate_term: 11 candidate_status { last_received { term: 9 index: 14899 } } ignore_live_leader: false dest_uuid: "773ff64ed1b249db9be71c247d7cbf43" is_pre_election: true
I20251024 08:16:50.077862 23179 raft_consensus.cc:2468] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 10 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate e9ac8f0e11a34e5fb1c19a793f211a56 in term 10.
I20251024 08:16:50.077997 23511 leader_election.cc:304] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [CANDIDATE]: Term 11 pre-election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 773ff64ed1b249db9be71c247d7cbf43, e9ac8f0e11a34e5fb1c19a793f211a56; no voters: 97d4708eb2b64571b34044be6da3d298
I20251024 08:16:50.078094 23670 raft_consensus.cc:2804] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 10 FOLLOWER]: Leader pre-election won for term 11
I20251024 08:16:50.078130 23670 raft_consensus.cc:493] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 10 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251024 08:16:50.078163 23670 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 10 FOLLOWER]: Advancing to term 11
I20251024 08:16:50.079137 23670 raft_consensus.cc:515] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 11 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:50.079244 23670 leader_election.cc:290] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [CANDIDATE]: Term 11 election: Requested vote from peers 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66:39115), 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069)
I20251024 08:16:50.079421 23314 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" candidate_term: 11 candidate_status { last_received { term: 9 index: 14899 } } ignore_live_leader: false dest_uuid: "97d4708eb2b64571b34044be6da3d298"
I20251024 08:16:50.079439 23179 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" candidate_term: 11 candidate_status { last_received { term: 9 index: 14899 } } ignore_live_leader: false dest_uuid: "773ff64ed1b249db9be71c247d7cbf43"
I20251024 08:16:50.079521 23179 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 10 FOLLOWER]: Advancing to term 11
W20251024 08:16:50.079594 23511 leader_election.cc:343] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [CANDIDATE]: Term 11 election: Tablet error from VoteRequest() call to peer 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069): Illegal state: must be running to vote when last-logged opid is not known
I20251024 08:16:50.080650 23179 raft_consensus.cc:2468] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 11 FOLLOWER]: Leader election vote request: Granting yes vote for candidate e9ac8f0e11a34e5fb1c19a793f211a56 in term 11.
I20251024 08:16:50.080803 23511 leader_election.cc:304] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [CANDIDATE]: Term 11 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 773ff64ed1b249db9be71c247d7cbf43, e9ac8f0e11a34e5fb1c19a793f211a56; no voters: 97d4708eb2b64571b34044be6da3d298
I20251024 08:16:50.080904 23670 raft_consensus.cc:2804] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 11 FOLLOWER]: Leader election won for term 11
I20251024 08:16:50.081024 23670 raft_consensus.cc:697] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 11 LEADER]: Becoming Leader. State: Replica: e9ac8f0e11a34e5fb1c19a793f211a56, State: Running, Role: LEADER
I20251024 08:16:50.081105 23670 consensus_queue.cc:237] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 14899, Committed index: 14899, Last appended: 9.14899, Last appended by leader: 14899, Current term: 11, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:50.081754 19914 catalog_manager.cc:5649] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 reported cstate change: term changed from 9 to 11. New cstate: current_term: 11 leader_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } health_report { overall_health: UNKNOWN } } }
I20251024 08:16:50.122820 23179 raft_consensus.cc:1275] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 11 FOLLOWER]: Refusing update from remote peer e9ac8f0e11a34e5fb1c19a793f211a56: Log matching property violated. Preceding OpId in replica: term: 9 index: 14899. Preceding OpId from leader: term: 11 index: 14901. (index mismatch)
W20251024 08:16:50.122841 23511 consensus_peers.cc:597] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 -> Peer 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069): Couldn't send request to peer 97d4708eb2b64571b34044be6da3d298. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 1: this message will repeat every 5th retry.
I20251024 08:16:50.123121 23670 consensus_queue.cc:1048] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [LEADER]: Connected to new peer: Peer: permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 14900, Last known committed idx: 14899, Time since last communication: 0.000s
I20251024 08:16:50.125025 23686 mvcc.cc:204] Tried to move back new op lower bound from 7214259446260690944 to 7214259446092558336. Current Snapshot: MvccSnapshot[applied={T|T < 7214259446260690944}]
I20251024 08:16:50.125058 23685 mvcc.cc:204] Tried to move back new op lower bound from 7214259446260690944 to 7214259446092558336. Current Snapshot: MvccSnapshot[applied={T|T < 7214259446260690944}]
I20251024 08:16:50.135538 23252 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Bootstrap replayed 4/4 log segments. Stats: ops{read=14897 overwritten=0 applied=14897 ignored=0} inserts{seen=744600 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251024 08:16:50.136245 23252 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Bootstrap complete.
W20251024 08:16:50.143532 23139 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47976: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
I20251024 08:16:50.144618 23252 ts_tablet_manager.cc:1403] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Time spent bootstrapping tablet: real 3.852s	user 3.285s	sys 0.555s
W20251024 08:16:50.145051 23139 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:47976: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
I20251024 08:16:50.145666 23252 raft_consensus.cc:359] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 10 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:50.145874 23252 raft_consensus.cc:740] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 10 FOLLOWER]: Becoming Follower/Learner. State: Replica: 97d4708eb2b64571b34044be6da3d298, State: Initialized, Role: FOLLOWER
I20251024 08:16:50.146000 23252 consensus_queue.cc:260] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 14897, Last appended: 9.14897, Last appended by leader: 14897, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:50.146250 23252 ts_tablet_manager.cc:1434] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
W20251024 08:16:50.149897 23274 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57356: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
I20251024 08:16:50.209905 23314 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 10 FOLLOWER]: Advancing to term 11
I20251024 08:16:50.211267 23314 raft_consensus.cc:1275] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 11 FOLLOWER]: Refusing update from remote peer e9ac8f0e11a34e5fb1c19a793f211a56: Log matching property violated. Preceding OpId in replica: term: 9 index: 14897. Preceding OpId from leader: term: 9 index: 14899. (index mismatch)
I20251024 08:16:50.211578 23690 consensus_queue.cc:1050] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [LEADER]: Got LMP mismatch error from peer: Peer: permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 14900, Last known committed idx: 14897, Time since last communication: 0.000s
I20251024 08:16:50.217948 23693 mvcc.cc:204] Tried to move back new op lower bound from 7214259446618120192 to 7214259446092558336. Current Snapshot: MvccSnapshot[applied={T|T < 7214259446292246528 or (T in {7214259446299619328,7214259446306996224})}]
W20251024 08:16:50.245649 23274 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57356: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:50.247242 23274 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:57356: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:16:51.203024 20543 scanner-internal.cc:458] Time spent opening tablet: real 5.708s	user 0.001s	sys 0.001s
W20251024 08:16:51.221828 20542 scanner-internal.cc:458] Time spent opening tablet: real 5.706s	user 0.001s	sys 0.000s
I20251024 08:16:52.167510 23427 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:52.167706 23294 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:52.169497 23159 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:52.183959 23559 tablet_service.cc:1467] Tablet server has 1 leaders and 3 scanners
I20251024 08:16:52.484148 19914 ts_manager.cc:284] Unset tserver state for 97d4708eb2b64571b34044be6da3d298 from MAINTENANCE_MODE
I20251024 08:16:52.495031 23493 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:52.584937 19914 ts_manager.cc:284] Unset tserver state for e9ac8f0e11a34e5fb1c19a793f211a56 from MAINTENANCE_MODE
I20251024 08:16:52.591833 19914 ts_manager.cc:284] Unset tserver state for 773ff64ed1b249db9be71c247d7cbf43 from MAINTENANCE_MODE
I20251024 08:16:52.617154 19914 ts_manager.cc:284] Unset tserver state for 36b0ebc9a5694a778497ec8d94aba993 from MAINTENANCE_MODE
I20251024 08:16:52.909700 19914 ts_manager.cc:295] Set tserver state for 773ff64ed1b249db9be71c247d7cbf43 to MAINTENANCE_MODE
I20251024 08:16:53.038638 19914 ts_manager.cc:295] Set tserver state for e9ac8f0e11a34e5fb1c19a793f211a56 to MAINTENANCE_MODE
I20251024 08:16:53.118058 19914 ts_manager.cc:295] Set tserver state for 97d4708eb2b64571b34044be6da3d298 to MAINTENANCE_MODE
I20251024 08:16:53.126919 23225 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:53.163575 19914 ts_manager.cc:295] Set tserver state for 36b0ebc9a5694a778497ec8d94aba993 to MAINTENANCE_MODE
I20251024 08:16:53.191170 23159 tablet_service.cc:1460] Tablet server 773ff64ed1b249db9be71c247d7cbf43 set to quiescing
I20251024 08:16:53.191239 23159 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:53.217209 23360 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:53.229508 23625 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:53.466795 23559 tablet_service.cc:1460] Tablet server e9ac8f0e11a34e5fb1c19a793f211a56 set to quiescing
I20251024 08:16:53.466859 23559 tablet_service.cc:1467] Tablet server has 1 leaders and 3 scanners
I20251024 08:16:53.471237 23427 tablet_service.cc:1460] Tablet server 36b0ebc9a5694a778497ec8d94aba993 set to quiescing
I20251024 08:16:53.471309 23427 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:53.471812 23820 raft_consensus.cc:993] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: : Instructing follower 97d4708eb2b64571b34044be6da3d298 to start an election
I20251024 08:16:53.471896 23820 raft_consensus.cc:1081] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 11 LEADER]: Signalling peer 97d4708eb2b64571b34044be6da3d298 to start an election
I20251024 08:16:53.472378 23313 tablet_service.cc:2038] Received Run Leader Election RPC: tablet_id: "29f8dacedea249e9883a604ac785b905"
dest_uuid: "97d4708eb2b64571b34044be6da3d298"
 from {username='slave'} at 127.18.80.67:43961
I20251024 08:16:53.472581 23313 raft_consensus.cc:493] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 11 FOLLOWER]: Starting forced leader election (received explicit request)
I20251024 08:16:53.472723 23313 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 11 FOLLOWER]: Advancing to term 12
I20251024 08:16:53.473809 23313 raft_consensus.cc:515] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 12 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:53.474490 23314 raft_consensus.cc:1240] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 12 FOLLOWER]: Rejecting Update request from peer e9ac8f0e11a34e5fb1c19a793f211a56 for earlier term 11. Current term is 12. Ops: [11.17757-11.17758]
I20251024 08:16:53.474709 23709 consensus_queue.cc:1059] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 }, Status: INVALID_TERM, Last received: 11.17756, Next index: 17757, Last known committed idx: 17755, Time since last communication: 0.000s
I20251024 08:16:53.474869 23703 raft_consensus.cc:3055] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 11 LEADER]: Stepping down as leader of term 11
I20251024 08:16:53.474900 23703 raft_consensus.cc:740] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 11 LEADER]: Becoming Follower/Learner. State: Replica: e9ac8f0e11a34e5fb1c19a793f211a56, State: Running, Role: LEADER
I20251024 08:16:53.474952 23703 consensus_queue.cc:260] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 17756, Committed index: 17756, Last appended: 11.17759, Last appended by leader: 17759, Current term: 11, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:53.475425 23703 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 11 FOLLOWER]: Advancing to term 12
W20251024 08:16:53.475427 23820 consensus_queue.cc:1175] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [NON_LEADER]: Queue is closed or peer was untracked, disregarding peer response. Response: responder_uuid: "773ff64ed1b249db9be71c247d7cbf43" responder_term: 11 status { last_received { term: 11 index: 17758 } last_committed_idx: 17756 last_received_current_leader { term: 11 index: 17758 } } server_quiescing: true
I20251024 08:16:53.476586 23313 leader_election.cc:290] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [CANDIDATE]: Term 12 election: Requested vote from peers 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66:39115), e9ac8f0e11a34e5fb1c19a793f211a56 (127.18.80.67:40981)
I20251024 08:16:53.481163 23179 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "97d4708eb2b64571b34044be6da3d298" candidate_term: 12 candidate_status { last_received { term: 11 index: 17756 } } ignore_live_leader: true dest_uuid: "773ff64ed1b249db9be71c247d7cbf43"
I20251024 08:16:53.481256 23179 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 11 FOLLOWER]: Advancing to term 12
I20251024 08:16:53.482050 23179 raft_consensus.cc:2410] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 12 FOLLOWER]: Leader election vote request: Denying vote to candidate 97d4708eb2b64571b34044be6da3d298 for term 12 because replica has last-logged OpId of term: 11 index: 17758, which is greater than that of the candidate, which has last-logged OpId of term: 11 index: 17756.
I20251024 08:16:53.482614 23579 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "97d4708eb2b64571b34044be6da3d298" candidate_term: 12 candidate_status { last_received { term: 11 index: 17756 } } ignore_live_leader: true dest_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56"
I20251024 08:16:53.482761 23579 raft_consensus.cc:2410] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 12 FOLLOWER]: Leader election vote request: Denying vote to candidate 97d4708eb2b64571b34044be6da3d298 for term 12 because replica has last-logged OpId of term: 11 index: 17759, which is greater than that of the candidate, which has last-logged OpId of term: 11 index: 17756.
I20251024 08:16:53.482949 23249 leader_election.cc:304] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [CANDIDATE]: Term 12 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 97d4708eb2b64571b34044be6da3d298; no voters: 773ff64ed1b249db9be71c247d7cbf43, e9ac8f0e11a34e5fb1c19a793f211a56
I20251024 08:16:53.483165 23877 raft_consensus.cc:2749] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 12 FOLLOWER]: Leader election lost for term 12. Reason: could not achieve majority
I20251024 08:16:53.496009 23493 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:53.511672 23294 tablet_service.cc:1460] Tablet server 97d4708eb2b64571b34044be6da3d298 set to quiescing
I20251024 08:16:53.511750 23294 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251024 08:16:53.750665 23709 raft_consensus.cc:670] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: failed to trigger leader election: Illegal state: leader elections are disabled
W20251024 08:16:53.774885 23888 raft_consensus.cc:670] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: failed to trigger leader election: Illegal state: leader elections are disabled
W20251024 08:16:53.867918 23877 raft_consensus.cc:670] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: failed to trigger leader election: Illegal state: leader elections are disabled
I20251024 08:16:54.645663 23559 tablet_service.cc:1460] Tablet server e9ac8f0e11a34e5fb1c19a793f211a56 set to quiescing
I20251024 08:16:54.645735 23559 tablet_service.cc:1467] Tablet server has 0 leaders and 1 scanners
I20251024 08:16:55.787614 23559 tablet_service.cc:1460] Tablet server e9ac8f0e11a34e5fb1c19a793f211a56 set to quiescing
I20251024 08:16:55.787684 23559 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:55.843322 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 23095
I20251024 08:16:55.855134 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.18.80.66:39115
--local_ip_for_outbound_sockets=127.18.80.66
--tserver_master_addrs=127.18.80.126:45025
--webserver_port=35129
--webserver_interface=127.18.80.66
--builtin_ntp_servers=127.18.80.84:38683
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251024 08:16:55.933967 23911 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:55.934170 23911 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:55.934190 23911 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:55.935546 23911 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:55.935595 23911 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.18.80.66
I20251024 08:16:55.937016 23911 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:38683
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.18.80.66:39115
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.18.80.66
--webserver_port=35129
--tserver_master_addrs=127.18.80.126:45025
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.23911
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.18.80.66
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:55.937196 23911 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:55.937392 23911 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:55.940183 23919 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:55.940210 23916 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:55.940237 23917 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:55.940304 23911 server_base.cc:1047] running on GCE node
I20251024 08:16:55.940455 23911 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:55.940627 23911 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:55.941745 23911 hybrid_clock.cc:648] HybridClock initialized: now 1761293815941735 us; error 34 us; skew 500 ppm
I20251024 08:16:55.942858 23911 webserver.cc:492] Webserver started at http://127.18.80.66:35129/ using document root <none> and password file <none>
I20251024 08:16:55.943095 23911 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:55.943140 23911 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:55.944350 23911 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:55.944999 23925 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:55.945148 23911 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:55.945209 23911 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal
uuid: "773ff64ed1b249db9be71c247d7cbf43"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:55.945504 23911 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251024 08:16:55.956516 23911 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:55.956835 23911 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:55.957022 23911 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:55.957290 23911 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251024 08:16:55.957707 23932 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251024 08:16:55.958557 23911 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251024 08:16:55.958597 23911 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:55.958633 23911 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251024 08:16:55.959143 23911 ts_tablet_manager.cc:616] Registered 1 tablets
I20251024 08:16:55.959174 23911 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:55.959244 23932 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Bootstrap starting.
I20251024 08:16:55.966324 23911 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.66:39115
I20251024 08:16:55.966409 24039 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.66:39115 every 8 connection(s)
I20251024 08:16:55.966708 23911 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-1/data/info.pb
I20251024 08:16:55.970147 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 23911
I20251024 08:16:55.970255 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 23230
I20251024 08:16:55.971904 24040 heartbeater.cc:344] Connected to a master server at 127.18.80.126:45025
I20251024 08:16:55.972007 24040 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:55.972237 24040 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:55.972784 19914 ts_manager.cc:194] Re-registered known tserver with Master: 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66:39115)
I20251024 08:16:55.973403 19914 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.18.80.66:39219
W20251024 08:16:55.982038 20512 connection.cc:537] client connection to 127.18.80.65:35069 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
I20251024 08:16:55.982227 20543 meta_cache.cc:1510] marking tablet server 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069) as failed
W20251024 08:16:55.982308 20543 meta_cache.cc:302] tablet 29f8dacedea249e9883a604ac785b905: replica 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069) has failed: Network error: TS failed: recv error from unknown peer: Transport endpoint is not connected (error 107)
I20251024 08:16:55.982234 20542 meta_cache.cc:1510] marking tablet server 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069) as failed
I20251024 08:16:55.982571 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.18.80.65:35069
--local_ip_for_outbound_sockets=127.18.80.65
--tserver_master_addrs=127.18.80.126:45025
--webserver_port=40139
--webserver_interface=127.18.80.65
--builtin_ntp_servers=127.18.80.84:38683
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251024 08:16:56.029328 23932 log.cc:826] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Log is configured to *not* fsync() on all Append() calls
W20251024 08:16:56.064639 24045 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:56.064812 24045 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:56.064831 24045 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:56.066237 24045 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:56.066298 24045 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.18.80.65
I20251024 08:16:56.067700 24045 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:38683
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.18.80.65:35069
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.18.80.65
--webserver_port=40139
--tserver_master_addrs=127.18.80.126:45025
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.24045
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.18.80.65
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:56.067938 24045 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:56.068163 24045 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:56.070626 24053 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:56.070621 24052 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:56.070863 24045 server_base.cc:1047] running on GCE node
W20251024 08:16:56.070938 24055 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:56.071134 24045 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:56.071331 24045 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:56.072466 24045 hybrid_clock.cc:648] HybridClock initialized: now 1761293816072452 us; error 34 us; skew 500 ppm
I20251024 08:16:56.073710 24045 webserver.cc:492] Webserver started at http://127.18.80.65:40139/ using document root <none> and password file <none>
I20251024 08:16:56.073936 24045 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:56.073984 24045 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:56.075143 24045 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.000s	sys 0.001s
I20251024 08:16:56.075825 24061 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:56.075994 24045 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.000s	sys 0.001s
I20251024 08:16:56.076067 24045 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal
uuid: "97d4708eb2b64571b34044be6da3d298"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:56.076326 24045 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251024 08:16:56.083527 24045 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:56.083762 24045 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:56.083870 24045 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:56.084045 24045 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251024 08:16:56.084477 24068 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251024 08:16:56.085350 24045 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251024 08:16:56.085402 24045 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:56.085426 24045 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251024 08:16:56.085930 24045 ts_tablet_manager.cc:616] Registered 1 tablets
I20251024 08:16:56.085970 24045 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:56.086035 24068 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Bootstrap starting.
I20251024 08:16:56.094035 24045 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.65:35069
I20251024 08:16:56.094089 24175 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.65:35069 every 8 connection(s)
I20251024 08:16:56.094614 24045 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-0/data/info.pb
I20251024 08:16:56.101008 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 24045
I20251024 08:16:56.101135 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 23363
I20251024 08:16:56.106707 24176 heartbeater.cc:344] Connected to a master server at 127.18.80.126:45025
I20251024 08:16:56.106900 24176 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:56.107225 24176 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:56.107825 19914 ts_manager.cc:194] Re-registered known tserver with Master: 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069)
I20251024 08:16:56.108346 19914 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.18.80.65:60025
I20251024 08:16:56.109654 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.18.80.68:34051
--local_ip_for_outbound_sockets=127.18.80.68
--tserver_master_addrs=127.18.80.126:45025
--webserver_port=33347
--webserver_interface=127.18.80.68
--builtin_ntp_servers=127.18.80.84:38683
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251024 08:16:56.193384 24068 log.cc:826] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Log is configured to *not* fsync() on all Append() calls
W20251024 08:16:56.226140 24179 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:56.226428 24179 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:56.226517 24179 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:56.228404 24179 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:56.228480 24179 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.18.80.68
I20251024 08:16:56.230193 24179 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:38683
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.18.80.68:34051
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.18.80.68
--webserver_port=33347
--tserver_master_addrs=127.18.80.126:45025
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.24179
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.18.80.68
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:56.230449 24179 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:56.230724 24179 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:56.233270 24186 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:56.233278 24185 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:56.233809 24188 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:56.233861 24179 server_base.cc:1047] running on GCE node
I20251024 08:16:56.234166 24179 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:56.234401 24179 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:56.235546 24179 hybrid_clock.cc:648] HybridClock initialized: now 1761293816235538 us; error 29 us; skew 500 ppm
I20251024 08:16:56.236779 24179 webserver.cc:492] Webserver started at http://127.18.80.68:33347/ using document root <none> and password file <none>
I20251024 08:16:56.237030 24179 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:56.237078 24179 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:56.238384 24179 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:56.238969 24194 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:56.239114 24179 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20251024 08:16:56.239188 24179 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal
uuid: "36b0ebc9a5694a778497ec8d94aba993"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:56.239440 24179 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251024 08:16:56.255074 24179 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:56.255350 24179 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:56.255475 24179 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:56.255726 24179 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251024 08:16:56.256094 24179 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251024 08:16:56.256134 24179 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:56.256166 24179 ts_tablet_manager.cc:616] Registered 0 tablets
I20251024 08:16:56.256220 24179 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:56.263087 24179 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.68:34051
I20251024 08:16:56.263128 24307 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.68:34051 every 8 connection(s)
I20251024 08:16:56.263513 24179 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-3/data/info.pb
I20251024 08:16:56.268023 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 24179
I20251024 08:16:56.268146 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 23496
I20251024 08:16:56.273578 24308 heartbeater.cc:344] Connected to a master server at 127.18.80.126:45025
I20251024 08:16:56.273711 24308 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:56.273993 24308 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:56.274508 19914 ts_manager.cc:194] Re-registered known tserver with Master: 36b0ebc9a5694a778497ec8d94aba993 (127.18.80.68:34051)
I20251024 08:16:56.275059 19914 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.18.80.68:59509
I20251024 08:16:56.282550 18753 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskNzynA4/build/release/bin/kudu
/tmp/dist-test-taskNzynA4/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.18.80.67:40981
--local_ip_for_outbound_sockets=127.18.80.67
--tserver_master_addrs=127.18.80.126:45025
--webserver_port=43985
--webserver_interface=127.18.80.67
--builtin_ntp_servers=127.18.80.84:38683
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251024 08:16:56.403156 24311 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251024 08:16:56.403410 24311 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251024 08:16:56.403450 24311 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251024 08:16:56.405799 24311 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251024 08:16:56.405902 24311 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.18.80.67
I20251024 08:16:56.408339 24311 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.18.80.84:38683
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.18.80.67:40981
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.18.80.67
--webserver_port=43985
--tserver_master_addrs=127.18.80.126:45025
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.24311
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.18.80.67
--log_dir=/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 8d75a6f6cef564a7be95d0a4dbcc58aa159edfc7
build type RELEASE
built by None at 24 Oct 2025 07:43:14 UTC on 5fd53c4cbb9d
build id 8691
I20251024 08:16:56.408666 24311 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251024 08:16:56.409045 24311 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251024 08:16:56.412073 24320 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:56.412178 24317 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251024 08:16:56.412397 24318 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251024 08:16:56.413340 24311 server_base.cc:1047] running on GCE node
I20251024 08:16:56.413535 24311 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251024 08:16:56.413789 24311 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251024 08:16:56.414955 24311 hybrid_clock.cc:648] HybridClock initialized: now 1761293816414941 us; error 28 us; skew 500 ppm
I20251024 08:16:56.416432 24311 webserver.cc:492] Webserver started at http://127.18.80.67:43985/ using document root <none> and password file <none>
I20251024 08:16:56.416673 24311 fs_manager.cc:362] Metadata directory not provided
I20251024 08:16:56.416723 24311 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251024 08:16:56.418329 24311 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:56.419251 24326 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251024 08:16:56.419464 24311 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251024 08:16:56.419551 24311 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data,/tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal
uuid: "e9ac8f0e11a34e5fb1c19a793f211a56"
format_stamp: "Formatted at 2025-10-24 08:16:18 on dist-test-slave-13l5"
I20251024 08:16:56.419889 24311 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251024 08:16:56.435582 24311 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251024 08:16:56.435916 24311 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251024 08:16:56.436072 24311 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251024 08:16:56.436348 24311 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251024 08:16:56.436983 24333 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251024 08:16:56.438143 24311 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251024 08:16:56.438206 24311 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:56.438246 24311 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251024 08:16:56.438983 24311 ts_tablet_manager.cc:616] Registered 1 tablets
I20251024 08:16:56.439054 24311 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251024 08:16:56.439102 24333 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Bootstrap starting.
I20251024 08:16:56.446573 24311 rpc_server.cc:307] RPC server started. Bound to: 127.18.80.67:40981
I20251024 08:16:56.447199 24311 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0/minicluster-data/ts-2/data/info.pb
I20251024 08:16:56.449646 24440 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.18.80.67:40981 every 8 connection(s)
I20251024 08:16:56.450400 18753 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskNzynA4/build/release/bin/kudu as pid 24311
I20251024 08:16:56.454972 24441 heartbeater.cc:344] Connected to a master server at 127.18.80.126:45025
I20251024 08:16:56.455108 24441 heartbeater.cc:461] Registering TS with master...
I20251024 08:16:56.455323 24441 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:16:56.455919 19910 ts_manager.cc:194] Re-registered known tserver with Master: e9ac8f0e11a34e5fb1c19a793f211a56 (127.18.80.67:40981)
I20251024 08:16:56.456422 19910 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.18.80.67:40429
I20251024 08:16:56.575505 24333 log.cc:826] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Log is configured to *not* fsync() on all Append() calls
I20251024 08:16:56.639180 23974 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:56.644613 24242 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:56.647046 24110 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:56.657335 24358 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:16:56.950592 23932 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Bootstrap replayed 1/4 log segments. Stats: ops{read=4622 overwritten=0 applied=4619 ignored=0} inserts{seen=230850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251024 08:16:56.974442 24040 heartbeater.cc:499] Master 127.18.80.126:45025 was elected leader, sending a full tablet report...
I20251024 08:16:57.109401 24176 heartbeater.cc:499] Master 127.18.80.126:45025 was elected leader, sending a full tablet report...
I20251024 08:16:57.276160 24308 heartbeater.cc:499] Master 127.18.80.126:45025 was elected leader, sending a full tablet report...
I20251024 08:16:57.457314 24441 heartbeater.cc:499] Master 127.18.80.126:45025 was elected leader, sending a full tablet report...
I20251024 08:16:57.460836 24068 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Bootstrap replayed 1/4 log segments. Stats: ops{read=4622 overwritten=0 applied=4619 ignored=0} inserts{seen=230850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251024 08:16:57.777006 23932 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Bootstrap replayed 2/4 log segments. Stats: ops{read=9243 overwritten=0 applied=9240 ignored=0} inserts{seen=461850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251024 08:16:57.820662 24333 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Bootstrap replayed 1/4 log segments. Stats: ops{read=4622 overwritten=0 applied=4619 ignored=0} inserts{seen=230850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251024 08:16:58.655745 23932 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Bootstrap replayed 3/4 log segments. Stats: ops{read=13867 overwritten=0 applied=13865 ignored=0} inserts{seen=693000 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251024 08:16:58.763182 24068 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Bootstrap replayed 2/4 log segments. Stats: ops{read=9244 overwritten=0 applied=9242 ignored=0} inserts{seen=461950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251024 08:16:59.112779 24333 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Bootstrap replayed 2/4 log segments. Stats: ops{read=9243 overwritten=0 applied=9241 ignored=0} inserts{seen=461900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251024 08:16:59.397091 23932 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Bootstrap replayed 4/4 log segments. Stats: ops{read=17758 overwritten=0 applied=17756 ignored=0} inserts{seen=887500 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251024 08:16:59.397612 23932 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Bootstrap complete.
I20251024 08:16:59.403808 23932 ts_tablet_manager.cc:1403] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Time spent bootstrapping tablet: real 3.445s	user 2.943s	sys 0.475s
I20251024 08:16:59.404816 23932 raft_consensus.cc:359] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 12 FOLLOWER]: Replica starting. Triggering 2 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:59.405543 23932 raft_consensus.cc:740] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 12 FOLLOWER]: Becoming Follower/Learner. State: Replica: 773ff64ed1b249db9be71c247d7cbf43, State: Initialized, Role: FOLLOWER
I20251024 08:16:59.405751 23932 consensus_queue.cc:260] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 17756, Last appended: 11.17758, Last appended by leader: 17758, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:59.406008 23932 ts_tablet_manager.cc:1434] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43: Time spent starting tablet: real 0.002s	user 0.005s	sys 0.000s
W20251024 08:16:59.452487 23954 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33892: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:16:59.698887 23954 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33892: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
I20251024 08:16:59.730859 24486 raft_consensus.cc:493] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 12 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251024 08:16:59.731022 24486 raft_consensus.cc:515] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 12 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:16:59.731971 24486 leader_election.cc:290] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [CANDIDATE]: Term 13 pre-election: Requested pre-vote from peers e9ac8f0e11a34e5fb1c19a793f211a56 (127.18.80.67:40981), 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069)
I20251024 08:16:59.735567 24130 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "773ff64ed1b249db9be71c247d7cbf43" candidate_term: 13 candidate_status { last_received { term: 11 index: 17758 } } ignore_live_leader: false dest_uuid: "97d4708eb2b64571b34044be6da3d298" is_pre_election: true
I20251024 08:16:59.735846 24384 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "773ff64ed1b249db9be71c247d7cbf43" candidate_term: 13 candidate_status { last_received { term: 11 index: 17758 } } ignore_live_leader: false dest_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" is_pre_election: true
W20251024 08:16:59.736658 23926 leader_election.cc:343] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [CANDIDATE]: Term 13 pre-election: Tablet error from VoteRequest() call to peer 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069): Illegal state: must be running to vote when last-logged opid is not known
W20251024 08:16:59.736867 23929 leader_election.cc:343] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [CANDIDATE]: Term 13 pre-election: Tablet error from VoteRequest() call to peer e9ac8f0e11a34e5fb1c19a793f211a56 (127.18.80.67:40981): Illegal state: must be running to vote when last-logged opid is not known
I20251024 08:16:59.736953 23929 leader_election.cc:304] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [CANDIDATE]: Term 13 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 773ff64ed1b249db9be71c247d7cbf43; no voters: 97d4708eb2b64571b34044be6da3d298, e9ac8f0e11a34e5fb1c19a793f211a56
I20251024 08:16:59.737088 24486 raft_consensus.cc:2749] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 12 FOLLOWER]: Leader pre-election lost for term 13. Reason: could not achieve majority
I20251024 08:16:59.843852 24068 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Bootstrap replayed 3/4 log segments. Stats: ops{read=13867 overwritten=0 applied=13864 ignored=0} inserts{seen=692950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
W20251024 08:16:59.952136 23954 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33892: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
I20251024 08:17:00.048859 24333 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Bootstrap replayed 3/4 log segments. Stats: ops{read=13867 overwritten=0 applied=13864 ignored=0} inserts{seen=692950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251024 08:17:00.145830 24486 raft_consensus.cc:493] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 12 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251024 08:17:00.145958 24486 raft_consensus.cc:515] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 12 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:17:00.146129 24486 leader_election.cc:290] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [CANDIDATE]: Term 13 pre-election: Requested pre-vote from peers e9ac8f0e11a34e5fb1c19a793f211a56 (127.18.80.67:40981), 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069)
I20251024 08:17:00.146405 24384 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "773ff64ed1b249db9be71c247d7cbf43" candidate_term: 13 candidate_status { last_received { term: 11 index: 17758 } } ignore_live_leader: false dest_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" is_pre_election: true
I20251024 08:17:00.146409 24130 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "773ff64ed1b249db9be71c247d7cbf43" candidate_term: 13 candidate_status { last_received { term: 11 index: 17758 } } ignore_live_leader: false dest_uuid: "97d4708eb2b64571b34044be6da3d298" is_pre_election: true
W20251024 08:17:00.146632 23929 leader_election.cc:343] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [CANDIDATE]: Term 13 pre-election: Tablet error from VoteRequest() call to peer e9ac8f0e11a34e5fb1c19a793f211a56 (127.18.80.67:40981): Illegal state: must be running to vote when last-logged opid is not known
W20251024 08:17:00.146706 23926 leader_election.cc:343] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [CANDIDATE]: Term 13 pre-election: Tablet error from VoteRequest() call to peer 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069): Illegal state: must be running to vote when last-logged opid is not known
I20251024 08:17:00.146790 23926 leader_election.cc:304] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [CANDIDATE]: Term 13 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 773ff64ed1b249db9be71c247d7cbf43; no voters: 97d4708eb2b64571b34044be6da3d298, e9ac8f0e11a34e5fb1c19a793f211a56
I20251024 08:17:00.146932 24486 raft_consensus.cc:2749] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 12 FOLLOWER]: Leader pre-election lost for term 13. Reason: could not achieve majority
W20251024 08:17:00.207325 23954 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33892: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
W20251024 08:17:00.475406 23954 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33892: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
I20251024 08:17:00.559720 24068 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Bootstrap replayed 4/4 log segments. Stats: ops{read=17756 overwritten=0 applied=17755 ignored=0} inserts{seen=887450 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20251024 08:17:00.560340 24068 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Bootstrap complete.
I20251024 08:17:00.566977 24068 ts_tablet_manager.cc:1403] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Time spent bootstrapping tablet: real 4.481s	user 3.882s	sys 0.567s
I20251024 08:17:00.567591 24068 raft_consensus.cc:359] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 12 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:17:00.568238 24068 raft_consensus.cc:740] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 12 FOLLOWER]: Becoming Follower/Learner. State: Replica: 97d4708eb2b64571b34044be6da3d298, State: Initialized, Role: FOLLOWER
I20251024 08:17:00.568377 24068 consensus_queue.cc:260] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 17755, Last appended: 11.17756, Last appended by leader: 17756, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:17:00.568639 24068 ts_tablet_manager.cc:1434] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298: Time spent starting tablet: real 0.002s	user 0.003s	sys 0.000s
W20251024 08:17:00.569370 24085 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43682: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
I20251024 08:17:00.744416 24333 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Bootstrap replayed 4/4 log segments. Stats: ops{read=17759 overwritten=0 applied=17756 ignored=0} inserts{seen=887500 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251024 08:17:00.744982 24333 tablet_bootstrap.cc:492] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Bootstrap complete.
I20251024 08:17:00.751013 24333 ts_tablet_manager.cc:1403] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Time spent bootstrapping tablet: real 4.312s	user 3.681s	sys 0.582s
I20251024 08:17:00.751957 24333 raft_consensus.cc:359] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 12 FOLLOWER]: Replica starting. Triggering 3 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:17:00.752552 24333 raft_consensus.cc:740] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 12 FOLLOWER]: Becoming Follower/Learner. State: Replica: e9ac8f0e11a34e5fb1c19a793f211a56, State: Initialized, Role: FOLLOWER
I20251024 08:17:00.752666 24333 consensus_queue.cc:260] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 17756, Last appended: 11.17759, Last appended by leader: 17759, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:17:00.752905 24333 ts_tablet_manager.cc:1434] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Time spent starting tablet: real 0.002s	user 0.003s	sys 0.001s
W20251024 08:17:00.759642 23954 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33892: Illegal state: replica 773ff64ed1b249db9be71c247d7cbf43 is not leader of this config: current role FOLLOWER
I20251024 08:17:00.763523 24497 raft_consensus.cc:493] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 12 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251024 08:17:00.763608 24497 raft_consensus.cc:515] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 12 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:17:00.763792 24497 leader_election.cc:290] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [CANDIDATE]: Term 13 pre-election: Requested pre-vote from peers e9ac8f0e11a34e5fb1c19a793f211a56 (127.18.80.67:40981), 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069)
I20251024 08:17:00.763994 24384 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "773ff64ed1b249db9be71c247d7cbf43" candidate_term: 13 candidate_status { last_received { term: 11 index: 17758 } } ignore_live_leader: false dest_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" is_pre_election: true
I20251024 08:17:00.764030 24130 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "773ff64ed1b249db9be71c247d7cbf43" candidate_term: 13 candidate_status { last_received { term: 11 index: 17758 } } ignore_live_leader: false dest_uuid: "97d4708eb2b64571b34044be6da3d298" is_pre_election: true
I20251024 08:17:00.764142 24384 raft_consensus.cc:2410] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 12 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate 773ff64ed1b249db9be71c247d7cbf43 for term 13 because replica has last-logged OpId of term: 11 index: 17759, which is greater than that of the candidate, which has last-logged OpId of term: 11 index: 17758.
I20251024 08:17:00.764163 24130 raft_consensus.cc:2468] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 12 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 773ff64ed1b249db9be71c247d7cbf43 in term 12.
I20251024 08:17:00.764348 23926 leader_election.cc:304] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [CANDIDATE]: Term 13 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 773ff64ed1b249db9be71c247d7cbf43, 97d4708eb2b64571b34044be6da3d298; no voters: 
I20251024 08:17:00.764480 24497 raft_consensus.cc:2804] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 12 FOLLOWER]: Leader pre-election won for term 13
I20251024 08:17:00.764544 24497 raft_consensus.cc:493] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 12 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251024 08:17:00.764578 24497 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 12 FOLLOWER]: Advancing to term 13
I20251024 08:17:00.765657 24497 raft_consensus.cc:515] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 13 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:17:00.765782 24497 leader_election.cc:290] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [CANDIDATE]: Term 13 election: Requested vote from peers e9ac8f0e11a34e5fb1c19a793f211a56 (127.18.80.67:40981), 97d4708eb2b64571b34044be6da3d298 (127.18.80.65:35069)
I20251024 08:17:00.765992 24130 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "773ff64ed1b249db9be71c247d7cbf43" candidate_term: 13 candidate_status { last_received { term: 11 index: 17758 } } ignore_live_leader: false dest_uuid: "97d4708eb2b64571b34044be6da3d298"
I20251024 08:17:00.766005 24384 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "29f8dacedea249e9883a604ac785b905" candidate_uuid: "773ff64ed1b249db9be71c247d7cbf43" candidate_term: 13 candidate_status { last_received { term: 11 index: 17758 } } ignore_live_leader: false dest_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56"
I20251024 08:17:00.766074 24384 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 12 FOLLOWER]: Advancing to term 13
I20251024 08:17:00.766074 24130 raft_consensus.cc:3060] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 12 FOLLOWER]: Advancing to term 13
I20251024 08:17:00.767194 24130 raft_consensus.cc:2468] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 13 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 773ff64ed1b249db9be71c247d7cbf43 in term 13.
I20251024 08:17:00.767212 24384 raft_consensus.cc:2410] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 13 FOLLOWER]: Leader election vote request: Denying vote to candidate 773ff64ed1b249db9be71c247d7cbf43 for term 13 because replica has last-logged OpId of term: 11 index: 17759, which is greater than that of the candidate, which has last-logged OpId of term: 11 index: 17758.
I20251024 08:17:00.767359 23926 leader_election.cc:304] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [CANDIDATE]: Term 13 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 773ff64ed1b249db9be71c247d7cbf43, 97d4708eb2b64571b34044be6da3d298; no voters: 
I20251024 08:17:00.767447 24497 raft_consensus.cc:2804] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 13 FOLLOWER]: Leader election won for term 13
I20251024 08:17:00.767589 24497 raft_consensus.cc:697] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [term 13 LEADER]: Becoming Leader. State: Replica: 773ff64ed1b249db9be71c247d7cbf43, State: Running, Role: LEADER
I20251024 08:17:00.767688 24497 consensus_queue.cc:237] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 17756, Committed index: 17756, Last appended: 11.17758, Last appended by leader: 17758, Current term: 13, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } }
I20251024 08:17:00.768297 19914 catalog_manager.cc:5649] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 reported cstate change: term changed from 11 to 13, leader changed from e9ac8f0e11a34e5fb1c19a793f211a56 (127.18.80.67) to 773ff64ed1b249db9be71c247d7cbf43 (127.18.80.66). New cstate: current_term: 13 leader_uuid: "773ff64ed1b249db9be71c247d7cbf43" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "773ff64ed1b249db9be71c247d7cbf43" member_type: VOTER last_known_addr { host: "127.18.80.66" port: 39115 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 } health_report { overall_health: UNKNOWN } } }
W20251024 08:17:00.793319 20543 scanner-internal.cc:458] Time spent opening tablet: real 6.026s	user 0.001s	sys 0.001s
W20251024 08:17:00.793323 20542 scanner-internal.cc:458] Time spent opening tablet: real 6.025s	user 0.001s	sys 0.001s
W20251024 08:17:00.855370 24085 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43682: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
I20251024 08:17:00.880005 24384 raft_consensus.cc:1275] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56 [term 13 FOLLOWER]: Refusing update from remote peer 773ff64ed1b249db9be71c247d7cbf43: Log matching property violated. Preceding OpId in replica: term: 11 index: 17759. Preceding OpId from leader: term: 13 index: 17759. (term mismatch)
I20251024 08:17:00.880097 24384 pending_rounds.cc:85] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Aborting all ops after (but not including) 17758
I20251024 08:17:00.880141 24384 pending_rounds.cc:107] T 29f8dacedea249e9883a604ac785b905 P e9ac8f0e11a34e5fb1c19a793f211a56: Aborting uncommitted WRITE_OP operation due to leader change: 11.17759
I20251024 08:17:00.880573 24497 consensus_queue.cc:1048] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [LEADER]: Connected to new peer: Peer: permanent_uuid: "e9ac8f0e11a34e5fb1c19a793f211a56" member_type: VOTER last_known_addr { host: "127.18.80.67" port: 40981 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 17759, Last known committed idx: 17756, Time since last communication: 0.000s
I20251024 08:17:00.882337 24506 rpcz_store.cc:275] Call kudu.tserver.TabletServerService.Write from 127.0.0.1:33892 (ReqId={client: e56ad7f9e267443db66bcd5b9fdd6ff9, seq_no=17750, attempt_no=76}) took 1431 ms. Trace:
I20251024 08:17:00.882443 24130 raft_consensus.cc:1275] T 29f8dacedea249e9883a604ac785b905 P 97d4708eb2b64571b34044be6da3d298 [term 13 FOLLOWER]: Refusing update from remote peer 773ff64ed1b249db9be71c247d7cbf43: Log matching property violated. Preceding OpId in replica: term: 11 index: 17756. Preceding OpId from leader: term: 13 index: 17759. (index mismatch)
I20251024 08:17:00.882495 24506 rpcz_store.cc:276] 1024 08:16:59.450823 (+     0us) service_pool.cc:168] Inserting onto call queue
1024 08:16:59.450848 (+    25us) service_pool.cc:225] Handling call
1024 08:17:00.882311 (+1431463us) inbound_call.cc:173] Queueing success response
Metrics: {}
I20251024 08:17:00.882704 24498 consensus_queue.cc:1048] T 29f8dacedea249e9883a604ac785b905 P 773ff64ed1b249db9be71c247d7cbf43 [LEADER]: Connected to new peer: Peer: permanent_uuid: "97d4708eb2b64571b34044be6da3d298" member_type: VOTER last_known_addr { host: "127.18.80.65" port: 35069 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 17759, Last known committed idx: 17755, Time since last communication: 0.000s
I20251024 08:17:00.884091 24507 rpcz_store.cc:275] Call kudu.tserver.TabletServerService.Write from 127.0.0.1:33892 (ReqId={client: e56ad7f9e267443db66bcd5b9fdd6ff9, seq_no=17751, attempt_no=76}) took 1411 ms. Trace:
I20251024 08:17:00.884155 24507 rpcz_store.cc:276] 1024 08:16:59.472229 (+     0us) service_pool.cc:168] Inserting onto call queue
1024 08:16:59.472276 (+    47us) service_pool.cc:225] Handling call
1024 08:17:00.884083 (+1411807us) inbound_call.cc:173] Queueing success response
Metrics: {}
W20251024 08:17:00.885421 24085 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43682: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:17:00.885847 24085 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43682: Illegal state: replica 97d4708eb2b64571b34044be6da3d298 is not leader of this config: current role FOLLOWER
W20251024 08:17:01.296036 20544 scanner-internal.cc:458] Time spent opening tablet: real 6.008s	user 0.002s	sys 0.000s
I20251024 08:17:01.927006 23974 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20251024 08:17:01.929016 24358 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
I20251024 08:17:01.933459 24242 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:17:01.934404 24110 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251024 08:17:02.211769 19910 ts_manager.cc:284] Unset tserver state for 773ff64ed1b249db9be71c247d7cbf43 from MAINTENANCE_MODE
I20251024 08:17:02.212461 19914 ts_manager.cc:284] Unset tserver state for 36b0ebc9a5694a778497ec8d94aba993 from MAINTENANCE_MODE
I20251024 08:17:02.280050 24308 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:17:02.335103 19914 ts_manager.cc:284] Unset tserver state for e9ac8f0e11a34e5fb1c19a793f211a56 from MAINTENANCE_MODE
I20251024 08:17:02.340948 19914 ts_manager.cc:284] Unset tserver state for 97d4708eb2b64571b34044be6da3d298 from MAINTENANCE_MODE
/home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/integration-tests/maintenance_mode-itest.cc:751: Failure
Value of: s.ok()
  Actual: true
Expected: false
/home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/util/test_util.cc:403: Failure
Failed
Timed out waiting for assertion to pass.
I20251024 08:17:02.885730 24176 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:17:02.887259 24441 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:17:02.892647 24040 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:17:03.281037 24308 heartbeater.cc:507] Master 127.18.80.126:45025 requested a full tablet report, sending...
I20251024 08:17:04.111250 18753 external_mini_cluster-itest-base.cc:80] Found fatal failure
I20251024 08:17:04.111372 18753 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 0 with UUID 97d4708eb2b64571b34044be6da3d298 and pid 24045
************************ BEGIN STACKS **************************
[New LWP 24048]
[New LWP 24049]
[New LWP 24050]
[New LWP 24051]
[New LWP 24057]
[New LWP 24058]
[New LWP 24059]
[New LWP 24062]
[New LWP 24063]
[New LWP 24064]
[New LWP 24065]
[New LWP 24066]
[New LWP 24067]
[New LWP 24069]
[New LWP 24070]
[New LWP 24071]
[New LWP 24072]
[New LWP 24073]
[New LWP 24074]
[New LWP 24075]
[New LWP 24076]
[New LWP 24077]
[New LWP 24078]
[New LWP 24079]
[New LWP 24080]
[New LWP 24081]
[New LWP 24082]
[New LWP 24083]
[New LWP 24084]
[New LWP 24085]
[New LWP 24086]
[New LWP 24087]
[New LWP 24088]
[New LWP 24089]
[New LWP 24090]
[New LWP 24091]
[New LWP 24092]
[New LWP 24093]
[New LWP 24094]
[New LWP 24095]
[New LWP 24096]
[New LWP 24097]
[New LWP 24098]
[New LWP 24099]
[New LWP 24100]
[New LWP 24101]
[New LWP 24102]
[New LWP 24103]
[New LWP 24104]
[New LWP 24105]
[New LWP 24106]
[New LWP 24107]
[New LWP 24108]
[New LWP 24109]
[New LWP 24110]
[New LWP 24111]
[New LWP 24112]
[New LWP 24113]
[New LWP 24114]
[New LWP 24115]
[New LWP 24116]
[New LWP 24117]
[New LWP 24118]
[New LWP 24119]
[New LWP 24120]
[New LWP 24121]
[New LWP 24122]
[New LWP 24123]
[New LWP 24124]
[New LWP 24125]
[New LWP 24126]
[New LWP 24127]
[New LWP 24128]
[New LWP 24129]
[New LWP 24130]
[New LWP 24131]
[New LWP 24132]
[New LWP 24133]
[New LWP 24134]
[New LWP 24135]
[New LWP 24136]
[New LWP 24137]
[New LWP 24138]
[New LWP 24139]
[New LWP 24140]
[New LWP 24141]
[New LWP 24142]
[New LWP 24143]
[New LWP 24144]
[New LWP 24145]
[New LWP 24146]
[New LWP 24147]
[New LWP 24148]
[New LWP 24149]
[New LWP 24150]
[New LWP 24151]
[New LWP 24152]
[New LWP 24153]
[New LWP 24154]
[New LWP 24155]
[New LWP 24156]
[New LWP 24157]
[New LWP 24158]
[New LWP 24159]
[New LWP 24160]
[New LWP 24161]
[New LWP 24162]
[New LWP 24163]
[New LWP 24164]
[New LWP 24165]
[New LWP 24166]
[New LWP 24167]
[New LWP 24168]
[New LWP 24169]
[New LWP 24170]
[New LWP 24171]
[New LWP 24172]
[New LWP 24173]
[New LWP 24174]
[New LWP 24175]
[New LWP 24176]
[New LWP 24177]
0x00007fce8e741d50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 24045 "kudu"  0x00007fce8e741d50 in ?? ()
  2    LWP 24048 "kudu"  0x00007fce8e73dfb9 in ?? ()
  3    LWP 24049 "kudu"  0x00007fce8e73dfb9 in ?? ()
  4    LWP 24050 "kudu"  0x00007fce8e73dfb9 in ?? ()
  5    LWP 24051 "kernel-watcher-" 0x00007fce8e73dfb9 in ?? ()
  6    LWP 24057 "ntp client-2405" 0x00007fce8e7419e2 in ?? ()
  7    LWP 24058 "file cache-evic" 0x00007fce8e73dfb9 in ?? ()
  8    LWP 24059 "sq_acceptor" 0x00007fce8c852cb9 in ?? ()
  9    LWP 24062 "rpc reactor-240" 0x00007fce8c85fa47 in ?? ()
  10   LWP 24063 "rpc reactor-240" 0x00007fce8c85fa47 in ?? ()
  11   LWP 24064 "rpc reactor-240" 0x00007fce8c85fa47 in ?? ()
  12   LWP 24065 "rpc reactor-240" 0x00007fce8c85fa47 in ?? ()
  13   LWP 24066 "MaintenanceMgr " 0x00007fce8e73dad3 in ?? ()
  14   LWP 24067 "txn-status-mana" 0x00007fce8e73dfb9 in ?? ()
  15   LWP 24069 "collect_and_rem" 0x00007fce8e73dfb9 in ?? ()
  16   LWP 24070 "tc-session-exp-" 0x00007fce8e73dfb9 in ?? ()
  17   LWP 24071 "rpc worker-2407" 0x00007fce8e73dad3 in ?? ()
  18   LWP 24072 "rpc worker-2407" 0x00007fce8e73dad3 in ?? ()
  19   LWP 24073 "rpc worker-2407" 0x00007fce8e73dad3 in ?? ()
  20   LWP 24074 "rpc worker-2407" 0x00007fce8e73dad3 in ?? ()
  21   LWP 24075 "rpc worker-2407" 0x00007fce8e73dad3 in ?? ()
  22   LWP 24076 "rpc worker-2407" 0x00007fce8e73dad3 in ?? ()
  23   LWP 24077 "rpc worker-2407" 0x00007fce8e73dad3 in ?? ()
  24   LWP 24078 "rpc worker-2407" 0x00007fce8e73dad3 in ?? ()
  25   LWP 24079 "rpc worker-2407" 0x00007fce8e73dad3 in ?? ()
  26   LWP 24080 "rpc worker-2408" 0x00007fce8e73dad3 in ?? ()
  27   LWP 24081 "rpc worker-2408" 0x00007fce8e73dad3 in ?? ()
  28   LWP 24082 "rpc worker-2408" 0x00007fce8e73dad3 in ?? ()
  29   LWP 24083 "rpc worker-2408" 0x00007fce8e73dad3 in ?? ()
  30   LWP 24084 "rpc worker-2408" 0x00007fce8e73dad3 in ?? ()
  31   LWP 24085 "rpc worker-2408" 0x00007fce8e73dad3 in ?? ()
  32   LWP 24086 "rpc worker-2408" 0x00007fce8e73dad3 in ?? ()
  33   LWP 24087 "rpc worker-2408" 0x00007fce8e73dad3 in ?? ()
  34   LWP 24088 "rpc worker-2408" 0x00007fce8e73dad3 in ?? ()
  35   LWP 24089 "rpc worker-2408" 0x00007fce8e73dad3 in ?? ()
  36   LWP 24090 "rpc worker-2409" 0x00007fce8e73dad3 in ?? ()
  37   LWP 24091 "rpc worker-2409" 0x00007fce8e73dad3 in ?? ()
  38   LWP 24092 "rpc worker-2409" 0x00007fce8e73dad3 in ?? ()
  39   LWP 24093 "rpc worker-2409" 0x00007fce8e73dad3 in ?? ()
  40   LWP 24094 "rpc worker-2409" 0x00007fce8e73dad3 in ?? ()
  41   LWP 24095 "rpc worker-2409" 0x00007fce8e73dad3 in ?? ()
  42   LWP 24096 "rpc worker-2409" 0x00007fce8e73dad3 in ?? ()
  43   LWP 24097 "rpc worker-2409" 0x00007fce8e73dad3 in ?? ()
  44   LWP 24098 "rpc worker-2409" 0x00007fce8e73dad3 in ?? ()
  45   LWP 24099 "rpc worker-2409" 0x00007fce8e73dad3 in ?? ()
  46   LWP 24100 "rpc worker-2410" 0x00007fce8e73dad3 in ?? ()
  47   LWP 24101 "rpc worker-2410" 0x00007fce8e73dad3 in ?? ()
  48   LWP 24102 "rpc worker-2410" 0x00007fce8e73dad3 in ?? ()
  49   LWP 24103 "rpc worker-2410" 0x00007fce8e73dad3 in ?? ()
  50   LWP 24104 "rpc worker-2410" 0x00007fce8e73dad3 in ?? ()
  51   LWP 24105 "rpc worker-2410" 0x00007fce8e73dad3 in ?? ()
  52   LWP 24106 "rpc worker-2410" 0x00007fce8e73dad3 in ?? ()
  53   LWP 24107 "rpc worker-2410" 0x00007fce8e73dad3 in ?? ()
  54   LWP 24108 "rpc worker-2410" 0x00007fce8e73dad3 in ?? ()
  55   LWP 24109 "rpc worker-2410" 0x00007fce8e73dad3 in ?? ()
  56   LWP 24110 "rpc worker-2411" 0x00007fce8e73dad3 in ?? ()
  57   LWP 24111 "rpc worker-2411" 0x00007fce8e73dad3 in ?? ()
  58   LWP 24112 "rpc worker-2411" 0x00007fce8e73dad3 in ?? ()
  59   LWP 24113 "rpc worker-2411" 0x00007fce8e73dad3 in ?? ()
  60   LWP 24114 "rpc worker-2411" 0x00007fce8e73dad3 in ?? ()
  61   LWP 24115 "rpc worker-2411" 0x00007fce8e73dad3 in ?? ()
  62   LWP 24116 "rpc worker-2411" 0x00007fce8e73dad3 in ?? ()
  63   LWP 24117 "rpc worker-2411" 0x00007fce8e73dad3 in ?? ()
  64   LWP 24118 "rpc worker-2411" 0x00007fce8e73dad3 in ?? ()
  65   LWP 24119 "rpc worker-2411" 0x00007fce8e73dad3 in ?? ()
  66   LWP 24120 "rpc worker-2412" 0x00007fce8e73dad3 in ?? ()
  67   LWP 24121 "rpc worker-2412" 0x00007fce8e73dad3 in ?? ()
  68   LWP 24122 "rpc worker-2412" 0x00007fce8e73dad3 in ?? ()
  69   LWP 24123 "rpc worker-2412" 0x00007fce8e73dad3 in ?? ()
  70   LWP 24124 "rpc worker-2412" 0x00007fce8e73dad3 in ?? ()
  71   LWP 24125 "rpc worker-2412" 0x00007fce8e73dad3 in ?? ()
  72   LWP 24126 "rpc worker-2412" 0x00007fce8e73dad3 in ?? ()
  73   LWP 24127 "rpc worker-2412" 0x00007fce8e73dad3 in ?? ()
  74   LWP 24128 "rpc worker-2412" 0x00007fce8e73dad3 in ?? ()
  75   LWP 24129 "rpc worker-2412" 0x00007fce8e73dad3 in ?? ()
  76   LWP 24130 "rpc worker-2413" 0x00007fce8e73dad3 in ?? ()
  77   LWP 24131 "rpc worker-2413" 0x00007fce8e73dad3 in ?? ()
  78   LWP 24132 "rpc worker-2413" 0x00007fce8e73dad3 in ?? ()
  79   LWP 24133 "rpc worker-2413" 0x00007fce8e73dad3 in ?? ()
  80   LWP 24134 "rpc worker-2413" 0x00007fce8e73dad3 in ?? ()
  81   LWP 24135 "rpc worker-2413" 0x00007fce8e73dad3 in ?? ()
  82   LWP 24136 "rpc worker-2413" 0x00007fce8e73dad3 in ?? ()
  83   LWP 24137 "rpc worker-2413" 0x00007fce8e73dad3 in ?? ()
  84   LWP 24138 "rpc worker-2413" 0x00007fce8e73dad3 in ?? ()
  85   LWP 24139 "rpc worker-2413" 0x00007fce8e73dad3 in ?? ()
  86   LWP 24140 "rpc worker-2414" 0x00007fce8e73dad3 in ?? ()
  87   LWP 24141 "rpc worker-2414" 0x00007fce8e73dad3 in ?? ()
  88   LWP 24142 "rpc worker-2414" 0x00007fce8e73dad3 in ?? ()
  89   LWP 24143 "rpc worker-2414" 0x00007fce8e73dad3 in ?? ()
  90   LWP 24144 "rpc worker-2414" 0x00007fce8e73dad3 in ?? ()
  91   LWP 24145 "rpc worker-2414" 0x00007fce8e73dad3 in ?? ()
  92   LWP 24146 "rpc worker-2414" 0x00007fce8e73dad3 in ?? ()
  93   LWP 24147 "rpc worker-2414" 0x00007fce8e73dad3 in ?? ()
  94   LWP 24148 "rpc worker-2414" 0x00007fce8e73dad3 in ?? ()
  95   LWP 24149 "rpc worker-2414" 0x00007fce8e73dad3 in ?? ()
  96   LWP 24150 "rpc worker-2415" 0x00007fce8e73dad3 in ?? ()
  97   LWP 24151 "rpc worker-2415" 0x00007fce8e73dad3 in ?? ()
  98   LWP 24152 "rpc worker-2415" 0x00007fce8e73dad3 in ?? ()
  99   LWP 24153 "rpc worker-2415" 0x00007fce8e73dad3 in ?? ()
  100  LWP 24154 "rpc worker-2415" 0x00007fce8e73dad3 in ?? ()
  101  LWP 24155 "rpc worker-2415" 0x00007fce8e73dad3 in ?? ()
  102  LWP 24156 "rpc worker-2415" 0x00007fce8e73dad3 in ?? ()
  103  LWP 24157 "rpc worker-2415" 0x00007fce8e73dad3 in ?? ()
  104  LWP 24158 "rpc worker-2415" 0x00007fce8e73dad3 in ?? ()
  105  LWP 24159 "rpc worker-2415" 0x00007fce8e73dad3 in ?? ()
  106  LWP 24160 "rpc worker-2416" 0x00007fce8e73dad3 in ?? ()
  107  LWP 24161 "rpc worker-2416" 0x00007fce8e73dad3 in ?? ()
  108  LWP 24162 "rpc worker-2416" 0x00007fce8e73dad3 in ?? ()
  109  LWP 24163 "rpc worker-2416" 0x00007fce8e73dad3 in ?? ()
  110  LWP 24164 "rpc worker-2416" 0x00007fce8e73dad3 in ?? ()
  111  LWP 24165 "rpc worker-2416" 0x00007fce8e73dad3 in ?? ()
  112  LWP 24166 "rpc worker-2416" 0x00007fce8e73dad3 in ?? ()
  113  LWP 24167 "rpc worker-2416" 0x00007fce8e73dad3 in ?? ()
  114  LWP 24168 "rpc worker-2416" 0x00007fce8e73dad3 in ?? ()
  115  LWP 24169 "rpc worker-2416" 0x00007fce8e73dad3 in ?? ()
  116  LWP 24170 "rpc worker-2417" 0x00007fce8e73dad3 in ?? ()
  117  LWP 24171 "diag-logger-241" 0x00007fce8e73dfb9 in ?? ()
  118  LWP 24172 "result-tracker-" 0x00007fce8e73dfb9 in ?? ()
  119  LWP 24173 "excess-log-dele" 0x00007fce8e73dfb9 in ?? ()
  120  LWP 24174 "tcmalloc-memory" 0x00007fce8e73dfb9 in ?? ()
  121  LWP 24175 "acceptor-24175" 0x00007fce8c8610c7 in ?? ()
  122  LWP 24176 "heartbeat-24176" 0x00007fce8e73dfb9 in ?? ()
  123  LWP 24177 "maintenance_sch" 0x00007fce8e73dfb9 in ?? ()

Thread 123 (LWP 24177):
#0  0x00007fce8e73dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000021 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055af22639e50 in ?? ()
#5  0x00007fce44fdf470 in ?? ()
#6  0x0000000000000042 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 24176):
#0  0x00007fce8e73dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055af225a3930 in ?? ()
#5  0x00007fce457e03f0 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 121 (LWP 24175):
#0  0x00007fce8c8610c7 in ?? ()
#1  0x00007fce45fe1020 in ?? ()
#2  0x00007fce8e3c1c02 in ?? ()
#3  0x00007fce45fe1020 in ?? ()
#4  0x0000000000080800 in ?? ()
#5  0x00007fce45fe13e0 in ?? ()
#6  0x00007fce45fe1090 in ?? ()
#7  0x000055af2255f0f8 in ?? ()
#8  0x00007fce8e3c7699 in ?? ()
#9  0x00007fce45fe1510 in ?? ()
#10 0x00007fce45fe1700 in ?? ()
#11 0x0000008000000000 in ?? ()
#12 0x00007fce8e7413a7 in ?? ()
#13 0x00007fce45fe2520 in ?? ()
#14 0x00007fce45fe1260 in ?? ()
#15 0x000055af225ff0c0 in ?? ()
#16 0x0000000000000000 in ?? ()

Thread 120 (LWP 24174):
#0  0x00007fce8e73dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffd18965c00 in ?? ()
#5  0x00007fce467e2670 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 24173):
#0  0x00007fce8e73dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 118 (LWP 24172):
#0  0x00007fce8e73dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000008 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055af224d7b70 in ?? ()
#5  0x00007fce477e4680 in ?? ()
#6  0x0000000000000010 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 24171):
#0  0x00007fce8e73dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000008 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055af22853390 in ?? ()
#5  0x00007fce47fe5550 in ?? ()
#6  0x0000000000000010 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 116 (LWP 24170):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 115 (LWP 24169):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 114 (LWP 24168):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 24167):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 24166):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 24165):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 24164):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 24163):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 24162):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 24161):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 24160):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 24159):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 24158):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 24157):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 24156):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 24155):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 24154):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 24153):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 24152):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000008 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055af22855738 in ?? ()
#4  0x00007fce517f85d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fce517f85f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 97 (LWP 24151):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 24150):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 24149):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 24148):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 24147):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 24146):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 24145):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 24144):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 24143):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 24142):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 24141):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 24140):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 24139):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 24138):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 24137):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 24136):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 24135):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 24134):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 24133):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 24132):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 24131):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 76 (LWP 24130):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000324 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055af2281f138 in ?? ()
#4  0x00007fce5c80e5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fce5c80e5f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 75 (LWP 24129):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x000000000000023b in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055af2281f0bc in ?? ()
#4  0x00007fce5d00f5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fce5d00f5f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055af2281f0a8 in ?? ()
#9  0x00007fce8e73d770 in ?? ()
#10 0x00007fce5d00f5f0 in ?? ()
#11 0x00007fce5d00f650 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 74 (LWP 24128):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 24127):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 24126):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 24125):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 24124):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 24123):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 24122):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 24121):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 24120):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 24119):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 24118):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 24117):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 24116):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 24115):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 24114):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 24113):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 24112):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 24111):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 56 (LWP 24110):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055af2281e638 in ?? ()
#4  0x00007fce668225d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fce668225f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 55 (LWP 24109):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 24108):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 24107):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 24106):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 24105):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 24104):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 24103):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 24102):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 24101):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 24100):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 24099):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 24098):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 24097):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 24096):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 24095):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 24094):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 24093):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 24092):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 24091):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 36 (LWP 24090):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055af22741b3c in ?? ()
#4  0x00007fce708365d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fce708365f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055af22741b28 in ?? ()
#9  0x00007fce8e73d770 in ?? ()
#10 0x00007fce708365f0 in ?? ()
#11 0x00007fce70836650 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 35 (LWP 24089):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 34 (LWP 24088):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 33 (LWP 24087):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 32 (LWP 24086):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000034 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055af22741a38 in ?? ()
#4  0x00007fce7283a5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fce7283a5f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 31 (LWP 24085):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000025 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055af22741abc in ?? ()
#4  0x00007fce7303b5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fce7303b5f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055af22741aa8 in ?? ()
#9  0x00007fce8e73d770 in ?? ()
#10 0x00007fce7303b5f0 in ?? ()
#11 0x00007fce7303b650 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 30 (LWP 24084):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 24083):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 24082):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 24081):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 24080):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 24079):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 24078):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 24077):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 24076):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 24075):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 24074):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 19 (LWP 24073):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 18 (LWP 24072):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 17 (LWP 24071):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 16 (LWP 24070):
#0  0x00007fce8e73dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 15 (LWP 24069):
#0  0x00007fce8e73dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055af224bd6c8 in ?? ()
#5  0x00007fce7b04b6a0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 24067):
#0  0x00007fce8e73dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 13 (LWP 24066):
#0  0x00007fce8e73dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 24065):
#0  0x00007fce8c85fa47 in ?? ()
#1  0x00007fce7d04f680 in ?? ()
#2  0x00007fce87b63571 in ?? ()
#3  0x00000000000002b7 in ?? ()
#4  0x000055af225b4e58 in ?? ()
#5  0x00007fce7d04f6c0 in ?? ()
#6  0x00007fce7d04f840 in ?? ()
#7  0x000055af22656c10 in ?? ()
#8  0x00007fce87b6525d in ?? ()
#9  0x3fb958e7fce54000 in ?? ()
#10 0x000055af225a6c00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055af225a6c00 in ?? ()
#13 0x00000000225b4e58 in ?? ()
#14 0x000055af00000000 in ?? ()
#15 0x41da3ecc3e24688d in ?? ()
#16 0x000055af22656c10 in ?? ()
#17 0x00007fce7d04f720 in ?? ()
#18 0x00007fce87b69ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb958e7fce54000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 11 (LWP 24064):
#0  0x00007fce8c85fa47 in ?? ()
#1  0x00007fce7d850680 in ?? ()
#2  0x00007fce87b63571 in ?? ()
#3  0x00000000000002b7 in ?? ()
#4  0x000055af225b5a98 in ?? ()
#5  0x00007fce7d8506c0 in ?? ()
#6  0x00007fce7d850840 in ?? ()
#7  0x000055af22656c10 in ?? ()
#8  0x00007fce87b6525d in ?? ()
#9  0x3fa8dc2cbc0f8000 in ?? ()
#10 0x000055af225a6100 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055af225a6100 in ?? ()
#13 0x00000000225b5a98 in ?? ()
#14 0x000055af00000000 in ?? ()
#15 0x41da3ecc3e24688c in ?? ()
#16 0x000055af22656c10 in ?? ()
#17 0x00007fce7d850720 in ?? ()
#18 0x00007fce87b69ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fa8dc2cbc0f8000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 10 (LWP 24063):
#0  0x00007fce8c85fa47 in ?? ()
#1  0x00007fce7e051680 in ?? ()
#2  0x00007fce87b63571 in ?? ()
#3  0x00000000000002b7 in ?? ()
#4  0x000055af225b5c58 in ?? ()
#5  0x00007fce7e0516c0 in ?? ()
#6  0x00007fce7e051840 in ?? ()
#7  0x000055af22656c10 in ?? ()
#8  0x00007fce87b6525d in ?? ()
#9  0x3fb97fc24b13c000 in ?? ()
#10 0x000055af225a5600 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055af225a5600 in ?? ()
#13 0x00000000225b5c58 in ?? ()
#14 0x000055af00000000 in ?? ()
#15 0x41da3ecc3e24688b in ?? ()
#16 0x000055af22656c10 in ?? ()
#17 0x00007fce7e051720 in ?? ()
#18 0x00007fce87b69ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb97fc24b13c000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 9 (LWP 24062):
#0  0x00007fce8c85fa47 in ?? ()
#1  0x00007fce7fe41680 in ?? ()
#2  0x00007fce87b63571 in ?? ()
#3  0x00000000000002b7 in ?? ()
#4  0x000055af225b5e18 in ?? ()
#5  0x00007fce7fe416c0 in ?? ()
#6  0x00007fce7fe41840 in ?? ()
#7  0x000055af22656c10 in ?? ()
#8  0x00007fce87b6525d in ?? ()
#9  0x3fb9777166164000 in ?? ()
#10 0x000055af225a5b80 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055af225a5b80 in ?? ()
#13 0x00000000225b5e18 in ?? ()
#14 0x000055af00000000 in ?? ()
#15 0x41da3ecc3e24688b in ?? ()
#16 0x000055af22656c10 in ?? ()
#17 0x00007fce7fe41720 in ?? ()
#18 0x00007fce87b69ba3 in ?? ()
#19 0x0000000000000000 in ?? ()

Thread 8 (LWP 24059):
#0  0x00007fce8c852cb9 in ?? ()
#1  0x00007fce81644840 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 7 (LWP 24058):
#0  0x00007fce8e73dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 6 (LWP 24057):
#0  0x00007fce8e7419e2 in ?? ()
#1  0x000055af224d7ee0 in ?? ()
#2  0x00007fce806424d0 in ?? ()
#3  0x00007fce80642450 in ?? ()
#4  0x00007fce80642570 in ?? ()
#5  0x00007fce80642790 in ?? ()
#6  0x00007fce806427a0 in ?? ()
#7  0x00007fce806424e0 in ?? ()
#8  0x00007fce806424d0 in ?? ()
#9  0x000055af224d6350 in ?? ()
#10 0x00007fce8eb2cc6f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 5 (LWP 24051):
#0  0x00007fce8e73dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000002a in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055af2265cdc8 in ?? ()
#5  0x00007fce82646430 in ?? ()
#6  0x0000000000000054 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 4 (LWP 24050):
#0  0x00007fce8e73dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055af224bc848 in ?? ()
#5  0x00007fce82e47790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 3 (LWP 24049):
#0  0x00007fce8e73dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055af224bc2a8 in ?? ()
#5  0x00007fce83648790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 2 (LWP 24048):
#0  0x00007fce8e73dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055af224bc188 in ?? ()
#5  0x00007fce83e49790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 1 (LWP 24045):
#0  0x00007fce8e741d50 in ?? ()
#1  0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20251024 08:17:04.599313 18753 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 1 with UUID 773ff64ed1b249db9be71c247d7cbf43 and pid 23911
************************ BEGIN STACKS **************************
[New LWP 23912]
[New LWP 23913]
[New LWP 23914]
[New LWP 23915]
[New LWP 23921]
[New LWP 23922]
[New LWP 23923]
[New LWP 23926]
[New LWP 23927]
[New LWP 23928]
[New LWP 23929]
[New LWP 23930]
[New LWP 23931]
[New LWP 23933]
[New LWP 23934]
[New LWP 23935]
[New LWP 23936]
[New LWP 23937]
[New LWP 23938]
[New LWP 23939]
[New LWP 23940]
[New LWP 23941]
[New LWP 23942]
[New LWP 23943]
[New LWP 23944]
[New LWP 23945]
[New LWP 23946]
[New LWP 23947]
[New LWP 23948]
[New LWP 23949]
[New LWP 23950]
[New LWP 23951]
[New LWP 23952]
[New LWP 23953]
[New LWP 23954]
[New LWP 23955]
[New LWP 23956]
[New LWP 23957]
[New LWP 23958]
[New LWP 23959]
[New LWP 23960]
[New LWP 23961]
[New LWP 23962]
[New LWP 23963]
[New LWP 23964]
[New LWP 23965]
[New LWP 23966]
[New LWP 23967]
[New LWP 23968]
[New LWP 23969]
[New LWP 23970]
[New LWP 23971]
[New LWP 23972]
[New LWP 23973]
[New LWP 23974]
[New LWP 23975]
[New LWP 23976]
[New LWP 23977]
[New LWP 23978]
[New LWP 23979]
[New LWP 23980]
[New LWP 23981]
[New LWP 23982]
[New LWP 23983]
[New LWP 23984]
[New LWP 23985]
[New LWP 23986]
[New LWP 23987]
[New LWP 23988]
[New LWP 23989]
[New LWP 23990]
[New LWP 23991]
[New LWP 23992]
[New LWP 23993]
[New LWP 23994]
[New LWP 23995]
[New LWP 23996]
[New LWP 23997]
[New LWP 23998]
[New LWP 23999]
[New LWP 24000]
[New LWP 24001]
[New LWP 24002]
[New LWP 24003]
[New LWP 24004]
[New LWP 24005]
[New LWP 24006]
[New LWP 24007]
[New LWP 24008]
[New LWP 24009]
[New LWP 24010]
[New LWP 24011]
[New LWP 24012]
[New LWP 24013]
[New LWP 24014]
[New LWP 24015]
[New LWP 24016]
[New LWP 24017]
[New LWP 24018]
[New LWP 24019]
[New LWP 24020]
[New LWP 24021]
[New LWP 24022]
[New LWP 24023]
[New LWP 24024]
[New LWP 24025]
[New LWP 24026]
[New LWP 24027]
[New LWP 24028]
[New LWP 24029]
[New LWP 24030]
[New LWP 24031]
[New LWP 24032]
[New LWP 24033]
[New LWP 24034]
[New LWP 24035]
[New LWP 24036]
[New LWP 24037]
[New LWP 24038]
[New LWP 24039]
[New LWP 24040]
[New LWP 24041]
[New LWP 24611]
0x00007f006999cd50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 23911 "kudu"  0x00007f006999cd50 in ?? ()
  2    LWP 23912 "kudu"  0x00007f0069998fb9 in ?? ()
  3    LWP 23913 "kudu"  0x00007f0069998fb9 in ?? ()
  4    LWP 23914 "kudu"  0x00007f0069998fb9 in ?? ()
  5    LWP 23915 "kernel-watcher-" 0x00007f0069998fb9 in ?? ()
  6    LWP 23921 "ntp client-2392" 0x00007f006999c9e2 in ?? ()
  7    LWP 23922 "file cache-evic" 0x00007f0069998fb9 in ?? ()
  8    LWP 23923 "sq_acceptor" 0x00007f0067aadcb9 in ?? ()
  9    LWP 23926 "rpc reactor-239" 0x00007f0067abaa47 in ?? ()
  10   LWP 23927 "rpc reactor-239" 0x00007f0067abaa47 in ?? ()
  11   LWP 23928 "rpc reactor-239" 0x00007f0067abaa47 in ?? ()
  12   LWP 23929 "rpc reactor-239" 0x00007f0067abaa47 in ?? ()
  13   LWP 23930 "MaintenanceMgr " 0x00007f0069998ad3 in ?? ()
  14   LWP 23931 "txn-status-mana" 0x00007f0069998fb9 in ?? ()
  15   LWP 23933 "collect_and_rem" 0x00007f0069998fb9 in ?? ()
  16   LWP 23934 "tc-session-exp-" 0x00007f0069998fb9 in ?? ()
  17   LWP 23935 "rpc worker-2393" 0x00007f0069998ad3 in ?? ()
  18   LWP 23936 "rpc worker-2393" 0x00007f0069998ad3 in ?? ()
  19   LWP 23937 "rpc worker-2393" 0x00007f0069998ad3 in ?? ()
  20   LWP 23938 "rpc worker-2393" 0x00007f0069998ad3 in ?? ()
  21   LWP 23939 "rpc worker-2393" 0x00007f0069998ad3 in ?? ()
  22   LWP 23940 "rpc worker-2394" 0x00007f0069998ad3 in ?? ()
  23   LWP 23941 "rpc worker-2394" 0x00007f0069998ad3 in ?? ()
  24   LWP 23942 "rpc worker-2394" 0x00007f0069998ad3 in ?? ()
  25   LWP 23943 "rpc worker-2394" 0x00007f0069998ad3 in ?? ()
  26   LWP 23944 "rpc worker-2394" 0x00007f0069998ad3 in ?? ()
  27   LWP 23945 "rpc worker-2394" 0x00007f0069998ad3 in ?? ()
  28   LWP 23946 "rpc worker-2394" 0x00007f0069998ad3 in ?? ()
  29   LWP 23947 "rpc worker-2394" 0x00007f0069998ad3 in ?? ()
  30   LWP 23948 "rpc worker-2394" 0x00007f0069998ad3 in ?? ()
  31   LWP 23949 "rpc worker-2394" 0x00007f0069998ad3 in ?? ()
  32   LWP 23950 "rpc worker-2395" 0x00007f0069998ad3 in ?? ()
  33   LWP 23951 "rpc worker-2395" 0x00007f0069998ad3 in ?? ()
  34   LWP 23952 "rpc worker-2395" 0x00007f0069998ad3 in ?? ()
  35   LWP 23953 "rpc worker-2395" 0x00007f0069998ad3 in ?? ()
  36   LWP 23954 "rpc worker-2395" 0x00007f0069998ad3 in ?? ()
  37   LWP 23955 "rpc worker-2395" 0x00007f0069998ad3 in ?? ()
  38   LWP 23956 "rpc worker-2395" 0x00007f0069998ad3 in ?? ()
  39   LWP 23957 "rpc worker-2395" 0x00007f0069998ad3 in ?? ()
  40   LWP 23958 "rpc worker-2395" 0x00007f0069998ad3 in ?? ()
  41   LWP 23959 "rpc worker-2395" 0x00007f0069998ad3 in ?? ()
  42   LWP 23960 "rpc worker-2396" 0x00007f0069998ad3 in ?? ()
  43   LWP 23961 "rpc worker-2396" 0x00007f0069998ad3 in ?? ()
  44   LWP 23962 "rpc worker-2396" 0x00007f0069998ad3 in ?? ()
  45   LWP 23963 "rpc worker-2396" 0x00007f0069998ad3 in ?? ()
  46   LWP 23964 "rpc worker-2396" 0x00007f0069998ad3 in ?? ()
  47   LWP 23965 "rpc worker-2396" 0x00007f0069998ad3 in ?? ()
  48   LWP 23966 "rpc worker-2396" 0x00007f0069998ad3 in ?? ()
  49   LWP 23967 "rpc worker-2396" 0x00007f0069998ad3 in ?? ()
  50   LWP 23968 "rpc worker-2396" 0x00007f0069998ad3 in ?? ()
  51   LWP 23969 "rpc worker-2396" 0x00007f0069998ad3 in ?? ()
  52   LWP 23970 "rpc worker-2397" 0x00007f0069998ad3 in ?? ()
  53   LWP 23971 "rpc worker-2397" 0x00007f0069998ad3 in ?? ()
  54   LWP 23972 "rpc worker-2397" 0x00007f0069998ad3 in ?? ()
  55   LWP 23973 "rpc worker-2397" 0x00007f0069998ad3 in ?? ()
  56   LWP 23974 "rpc worker-2397" 0x00007f0069998ad3 in ?? ()
  57   LWP 23975 "rpc worker-2397" 0x00007f0069998ad3 in ?? ()
  58   LWP 23976 "rpc worker-2397" 0x00007f0069998ad3 in ?? ()
  59   LWP 23977 "rpc worker-2397" 0x00007f0069998ad3 in ?? ()
  60   LWP 23978 "rpc worker-2397" 0x00007f0069998ad3 in ?? ()
  61   LWP 23979 "rpc worker-2397" 0x00007f0069998ad3 in ?? ()
  62   LWP 23980 "rpc worker-2398" 0x00007f0069998ad3 in ?? ()
  63   LWP 23981 "rpc worker-2398" 0x00007f0069998ad3 in ?? ()
  64   LWP 23982 "rpc worker-2398" 0x00007f0069998ad3 in ?? ()
  65   LWP 23983 "rpc worker-2398" 0x00007f0069998ad3 in ?? ()
  66   LWP 23984 "rpc worker-2398" 0x00007f0069998ad3 in ?? ()
  67   LWP 23985 "rpc worker-2398" 0x00007f0069998ad3 in ?? ()
  68   LWP 23986 "rpc worker-2398" 0x00007f0069998ad3 in ?? ()
  69   LWP 23987 "rpc worker-2398" 0x00007f0069998ad3 in ?? ()
  70   LWP 23988 "rpc worker-2398" 0x00007f0069998ad3 in ?? ()
  71   LWP 23989 "rpc worker-2398" 0x00007f0069998ad3 in ?? ()
  72   LWP 23990 "rpc worker-2399" 0x00007f0069998ad3 in ?? ()
  73   LWP 23991 "rpc worker-2399" 0x00007f0069998ad3 in ?? ()
  74   LWP 23992 "rpc worker-2399" 0x00007f0069998ad3 in ?? ()
  75   LWP 23993 "rpc worker-2399" 0x00007f0069998ad3 in ?? ()
  76   LWP 23994 "rpc worker-2399" 0x00007f0069998ad3 in ?? ()
  77   LWP 23995 "rpc worker-2399" 0x00007f0069998ad3 in ?? ()
  78   LWP 23996 "rpc worker-2399" 0x00007f0069998ad3 in ?? ()
  79   LWP 23997 "rpc worker-2399" 0x00007f0069998ad3 in ?? ()
  80   LWP 23998 "rpc worker-2399" 0x00007f0069998ad3 in ?? ()
  81   LWP 23999 "rpc worker-2399" 0x00007f0069998ad3 in ?? ()
  82   LWP 24000 "rpc worker-2400" 0x00007f0069998ad3 in ?? ()
  83   LWP 24001 "rpc worker-2400" 0x00007f0069998ad3 in ?? ()
  84   LWP 24002 "rpc worker-2400" 0x00007f0069998ad3 in ?? ()
  85   LWP 24003 "rpc worker-2400" 0x00007f0069998ad3 in ?? ()
  86   LWP 24004 "rpc worker-2400" 0x00007f0069998ad3 in ?? ()
  87   LWP 24005 "rpc worker-2400" 0x00007f0069998ad3 in ?? ()
  88   LWP 24006 "rpc worker-2400" 0x00007f0069998ad3 in ?? ()
  89   LWP 24007 "rpc worker-2400" 0x00007f0069998ad3 in ?? ()
  90   LWP 24008 "rpc worker-2400" 0x00007f0069998ad3 in ?? ()
  91   LWP 24009 "rpc worker-2400" 0x00007f0069998ad3 in ?? ()
  92   LWP 24010 "rpc worker-2401" 0x00007f0069998ad3 in ?? ()
  93   LWP 24011 "rpc worker-2401" 0x00007f0069998ad3 in ?? ()
  94   LWP 24012 "rpc worker-2401" 0x00007f0069998ad3 in ?? ()
  95   LWP 24013 "rpc worker-2401" 0x00007f0069998ad3 in ?? ()
  96   LWP 24014 "rpc worker-2401" 0x00007f0069998ad3 in ?? ()
  97   LWP 24015 "rpc worker-2401" 0x00007f0069998ad3 in ?? ()
  98   LWP 24016 "rpc worker-2401" 0x00007f0069998ad3 in ?? ()
  99   LWP 24017 "rpc worker-2401" 0x00007f0069998ad3 in ?? ()
  100  LWP 24018 "rpc worker-2401" 0x00007f0069998ad3 in ?? ()
  101  LWP 24019 "rpc worker-2401" 0x00007f0069998ad3 in ?? ()
  102  LWP 24020 "rpc worker-2402" 0x00007f0069998ad3 in ?? ()
  103  LWP 24021 "rpc worker-2402" 0x00007f0069998ad3 in ?? ()
  104  LWP 24022 "rpc worker-2402" 0x00007f0069998ad3 in ?? ()
  105  LWP 24023 "rpc worker-2402" 0x00007f0069998ad3 in ?? ()
  106  LWP 24024 "rpc worker-2402" 0x00007f0069998ad3 in ?? ()
  107  LWP 24025 "rpc worker-2402" 0x00007f0069998ad3 in ?? ()
  108  LWP 24026 "rpc worker-2402" 0x00007f0069998ad3 in ?? ()
  109  LWP 24027 "rpc worker-2402" 0x00007f0069998ad3 in ?? ()
  110  LWP 24028 "rpc worker-2402" 0x00007f0069998ad3 in ?? ()
  111  LWP 24029 "rpc worker-2402" 0x00007f0069998ad3 in ?? ()
  112  LWP 24030 "rpc worker-2403" 0x00007f0069998ad3 in ?? ()
  113  LWP 24031 "rpc worker-2403" 0x00007f0069998ad3 in ?? ()
  114  LWP 24032 "rpc worker-2403" 0x00007f0069998ad3 in ?? ()
  115  LWP 24033 "rpc worker-2403" 0x00007f0069998ad3 in ?? ()
  116  LWP 24034 "rpc worker-2403" 0x00007f0069998ad3 in ?? ()
  117  LWP 24035 "diag-logger-240" 0x00007f0069998fb9 in ?? ()
  118  LWP 24036 "result-tracker-" 0x00007f0069998fb9 in ?? ()
  119  LWP 24037 "excess-log-dele" 0x00007f0069998fb9 in ?? ()
  120  LWP 24038 "tcmalloc-memory" 0x00007f0069998fb9 in ?? ()
  121  LWP 24039 "acceptor-24039" 0x00007f0067abc0c7 in ?? ()
  122  LWP 24040 "heartbeat-24040" 0x00007f0069998fb9 in ?? ()
  123  LWP 24041 "maintenance_sch" 0x00007f0069998fb9 in ?? ()
  124  LWP 24611 "raft [worker]-2" 0x00007f0069998fb9 in ?? ()

Thread 124 (LWP 24611):
#0  0x00007f0069998fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000009b in ?? ()
#3  0x0000000100000081 in ?? ()
#4  0x00007f001f43d764 in ?? ()
#5  0x00007f001f43d510 in ?? ()
#6  0x0000000000000137 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00007f001f43d530 in ?? ()
#9  0x0000000200000000 in ?? ()
#10 0x0000008000000189 in ?? ()
#11 0x00007f001f43d590 in ?? ()
#12 0x00007f006960c2e1 in ?? ()
#13 0x0000000000000000 in ?? ()

Thread 123 (LWP 24041):
#0  0x00007f0069998fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000024 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055ae833fbe50 in ?? ()
#5  0x00007f002043f470 in ?? ()
#6  0x0000000000000048 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 24040):
#0  0x00007f0069998fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000000c in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055ae83365930 in ?? ()
#5  0x00007f0020c403f0 in ?? ()
#6  0x0000000000000018 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 121 (LWP 24039):
#0  0x00007f0067abc0c7 in ?? ()
#1  0x00007f0021441020 in ?? ()
#2  0x00007f006961cc02 in ?? ()
#3  0x00007f0021441020 in ?? ()
#4  0x0000000000080800 in ?? ()
#5  0x00007f00214413e0 in ?? ()
#6  0x00007f0021441090 in ?? ()
#7  0x000055ae833210f8 in ?? ()
#8  0x00007f0069622699 in ?? ()
#9  0x00007f0021441510 in ?? ()
#10 0x00007f0021441700 in ?? ()
#11 0x0000008000000000 in ?? ()
#12 0x00007f006999c3a7 in ?? ()
#13 0x00007f0021442520 in ?? ()
#14 0x00007f0021441260 in ?? ()
#15 0x000055ae833c10c0 in ?? ()
#16 0x0000000000000000 in ?? ()

Thread 120 (LWP 24038):
#0  0x00007f0069998fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffd0726bb00 in ?? ()
#5  0x00007f0021c42670 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 24037):
#0  0x00007f0069998fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 118 (LWP 24036):
#0  0x00007f0069998fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055ae83299b70 in ?? ()
#5  0x00007f0022c44680 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 24035):
#0  0x00007f0069998fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055ae83615390 in ?? ()
#5  0x00007f0023445550 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 116 (LWP 24034):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000008 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055ae83616738 in ?? ()
#4  0x00007f0023c465d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f0023c465f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 115 (LWP 24033):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 114 (LWP 24032):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 24031):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 24030):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 24029):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 24028):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 24027):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 24026):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 24025):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 24024):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 24023):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 24022):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 24021):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 24020):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 24019):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 24018):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 24017):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 24016):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 97 (LWP 24015):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 24014):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 24013):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 24012):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 24011):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 24010):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 24009):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 24008):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 24007):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 24006):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 24005):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 24004):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 24003):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 24002):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 24001):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 24000):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 23999):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 23998):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 23997):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 23996):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 23995):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 76 (LWP 23994):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055ae835e1138 in ?? ()
#4  0x00007f0037c6e5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f0037c6e5f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 75 (LWP 23993):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 74 (LWP 23992):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 23991):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 23990):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 23989):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 23988):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 23987):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 23986):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 23985):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 23984):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 23983):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 23982):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 23981):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 23980):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 23979):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 23978):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 23977):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 23976):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 23975):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 56 (LWP 23974):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055ae835e0638 in ?? ()
#4  0x00007f0041c825d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f0041c825f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 55 (LWP 23973):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 23972):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 23971):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 23970):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 23969):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 23968):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 23967):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 23966):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 23965):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 23964):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 23963):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 23962):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 23961):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 23960):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 23959):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 23958):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 23957):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 23956):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 23955):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 36 (LWP 23954):
#0  0x00007f0069998ad3 in ?? ()
#1  0x000000000000023d in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055ae83503b3c in ?? ()
#4  0x00007f004bc965d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f004bc965f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055ae83503b28 in ?? ()
#9  0x00007f0069998770 in ?? ()
#10 0x00007f004bc965f0 in ?? ()
#11 0x00007f004bc96650 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 35 (LWP 23953):
#0  0x00007f0069998ad3 in ?? ()
#1  0x000000000000013e in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055ae83503ab8 in ?? ()
#4  0x00007f004c4975d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f004c4975f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 34 (LWP 23952):
#0  0x00007f0069998ad3 in ?? ()
#1  0x00000000000001cc in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055ae83503a38 in ?? ()
#4  0x00007f004cc985d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f004cc985f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 33 (LWP 23951):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 32 (LWP 23950):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 31 (LWP 23949):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 30 (LWP 23948):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 23947):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 23946):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 23945):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 23944):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 23943):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 23942):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 23941):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 23940):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 23939):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 23938):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 19 (LWP 23937):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 18 (LWP 23936):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 17 (LWP 23935):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 16 (LWP 23934):
#0  0x00007f0069998fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 15 (LWP 23933):
#0  0x00007f0069998fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055ae8327f6c8 in ?? ()
#5  0x00007f00564ab6a0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 23931):
#0  0x00007f0069998fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 13 (LWP 23930):
#0  0x00007f0069998ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 23929):
#0  0x00007f0067abaa47 in ?? ()
#1  0x00007f00584af680 in ?? ()
#2  0x00007f0062dbe571 in ?? ()
#3  0x00000000000002b7 in ?? ()
#4  0x000055ae83376e58 in ?? ()
#5  0x00007f00584af6c0 in ?? ()
#6  0x00007f00584af840 in ?? ()
#7  0x000055ae83418c10 in ?? ()
#8  0x00007f0062dc025d in ?? ()
#9  0x3fb307dbec7a8000 in ?? ()
#10 0x000055ae83368c00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055ae83368c00 in ?? ()
#13 0x0000000083376e58 in ?? ()
#14 0x000055ae00000000 in ?? ()
#15 0x41da3ecc3e246888 in ?? ()
#16 0x000055ae83418c10 in ?? ()
#17 0x00007f00584af720 in ?? ()
#18 0x00007f0062dc4ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb307dbec7a8000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 11 (LWP 23928):
#0  0x00007f0067abaa47 in ?? ()
#1  0x00007f0058cb0680 in ?? ()
#2  0x00007f0062dbe571 in ?? ()
#3  0x00000000000002b7 in ?? ()
#4  0x000055ae83377a98 in ?? ()
#5  0x00007f0058cb06c0 in ?? ()
#6  0x00007f0058cb0840 in ?? ()
#7  0x000055ae83418c10 in ?? ()
#8  0x00007f0062dc025d in ?? ()
#9  0x3fb9756cad630000 in ?? ()
#10 0x000055ae83368100 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055ae83368100 in ?? ()
#13 0x0000000083377a98 in ?? ()
#14 0x000055ae00000000 in ?? ()
#15 0x41da3ecc3e24688b in ?? ()
#16 0x000055ae83418c10 in ?? ()
#17 0x00007f0058cb0720 in ?? ()
#18 0x00007f0062dc4ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb9756cad630000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 10 (LWP 23927):
#0  0x00007f0067abaa47 in ?? ()
#1  0x00007f00594b1680 in ?? ()
#2  0x00007f0062dbe571 in ?? ()
#3  0x00000000000002b7 in ?? ()
#4  0x000055ae83377c58 in ?? ()
#5  0x00007f00594b16c0 in ?? ()
#6  0x00007f00594b1840 in ?? ()
#7  0x000055ae83418c10 in ?? ()
#8  0x00007f0062dc025d in ?? ()
#9  0x3fa5a86005e80000 in ?? ()
#10 0x000055ae83367b80 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055ae83367b80 in ?? ()
#13 0x0000000083377c58 in ?? ()
#14 0x000055ae00000000 in ?? ()
#15 0x41da3ecc3e24688a in ?? ()
#16 0x000055ae83418c10 in ?? ()
#17 0x00007f00594b1720 in ?? ()
#18 0x00007f0062dc4ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fa5a86005e80000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 9 (LWP 23926):
#0  0x00007f0067abaa47 in ?? ()
#1  0x00007f005b09c680 in ?? ()
#2  0x00007f0062dbe571 in ?? ()
#3  0x00000000000002b7 in ?? ()
#4  0x000055ae83377e18 in ?? ()
#5  0x00007f005b09c6c0 in ?? ()
#6  0x00007f005b09c840 in ?? ()
#7  0x000055ae83418c10 in ?? ()
#8  0x00007f0062dc025d in ?? ()
#9  0x3fa552038b710000 in ?? ()
#10 0x000055ae83367600 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055ae83367600 in ?? ()
#13 0x0000000083377e18 in ?? ()
#14 0x000055ae00000000 in ?? ()
#15 0x41da3ecc3e24688a in ?? ()
#16 0x000055ae83418c10 in ?? ()
#17 0x00007f005b09c720 in ?? ()
#18 0x00007f0062dc4ba3 in ?? ()
#19 0x0000000000000000 in ?? ()

Thread 8 (LWP 23923):
#0  0x00007f0067aadcb9 in ?? ()
#1  0x00007f005c89f840 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 7 (LWP 23922):
#0  0x00007f0069998fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 6 (LWP 23921):
#0  0x00007f006999c9e2 in ?? ()
#1  0x000055ae83299ee0 in ?? ()
#2  0x00007f005b89d4d0 in ?? ()
#3  0x00007f005b89d450 in ?? ()
#4  0x00007f005b89d570 in ?? ()
#5  0x00007f005b89d790 in ?? ()
#6  0x00007f005b89d7a0 in ?? ()
#7  0x00007f005b89d4e0 in ?? ()
#8  0x00007f005b89d4d0 in ?? ()
#9  0x000055ae83298350 in ?? ()
#10 0x00007f0069d87c6f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 5 (LWP 23915):
#0  0x00007f0069998fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000002d in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055ae8341edc8 in ?? ()
#5  0x00007f005d8a1430 in ?? ()
#6  0x000000000000005a in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 4 (LWP 23914):
#0  0x00007f0069998fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055ae8327e848 in ?? ()
#5  0x00007f005e0a2790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 3 (LWP 23913):
#0  0x00007f0069998fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055ae8327e2a8 in ?? ()
#5  0x00007f005e8a3790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 2 (LWP 23912):
#0  0x00007f0069998fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055ae8327e188 in ?? ()
#5  0x00007f005f0a4790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 1 (LWP 23911):
#0  0x00007f006999cd50 in ?? ()
#1  0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20251024 08:17:05.088961 18753 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 2 with UUID e9ac8f0e11a34e5fb1c19a793f211a56 and pid 24311
************************ BEGIN STACKS **************************
[New LWP 24313]
[New LWP 24314]
[New LWP 24315]
[New LWP 24316]
[New LWP 24322]
[New LWP 24323]
[New LWP 24324]
[New LWP 24327]
[New LWP 24328]
[New LWP 24329]
[New LWP 24330]
[New LWP 24331]
[New LWP 24332]
[New LWP 24334]
[New LWP 24335]
[New LWP 24336]
[New LWP 24337]
[New LWP 24338]
[New LWP 24339]
[New LWP 24340]
[New LWP 24341]
[New LWP 24342]
[New LWP 24343]
[New LWP 24344]
[New LWP 24345]
[New LWP 24346]
[New LWP 24347]
[New LWP 24348]
[New LWP 24349]
[New LWP 24350]
[New LWP 24351]
[New LWP 24352]
[New LWP 24353]
[New LWP 24354]
[New LWP 24355]
[New LWP 24356]
[New LWP 24357]
[New LWP 24358]
[New LWP 24359]
[New LWP 24360]
[New LWP 24361]
[New LWP 24362]
[New LWP 24363]
[New LWP 24364]
[New LWP 24365]
[New LWP 24366]
[New LWP 24367]
[New LWP 24368]
[New LWP 24369]
[New LWP 24370]
[New LWP 24371]
[New LWP 24372]
[New LWP 24373]
[New LWP 24374]
[New LWP 24375]
[New LWP 24376]
[New LWP 24377]
[New LWP 24378]
[New LWP 24379]
[New LWP 24380]
[New LWP 24381]
[New LWP 24382]
[New LWP 24383]
[New LWP 24384]
[New LWP 24385]
[New LWP 24386]
[New LWP 24387]
[New LWP 24388]
[New LWP 24389]
[New LWP 24390]
[New LWP 24391]
[New LWP 24392]
[New LWP 24393]
[New LWP 24394]
[New LWP 24395]
[New LWP 24396]
[New LWP 24397]
[New LWP 24398]
[New LWP 24399]
[New LWP 24400]
[New LWP 24401]
[New LWP 24402]
[New LWP 24403]
[New LWP 24404]
[New LWP 24405]
[New LWP 24406]
[New LWP 24407]
[New LWP 24408]
[New LWP 24409]
[New LWP 24410]
[New LWP 24411]
[New LWP 24412]
[New LWP 24413]
[New LWP 24414]
[New LWP 24415]
[New LWP 24416]
[New LWP 24417]
[New LWP 24418]
[New LWP 24419]
[New LWP 24420]
[New LWP 24421]
[New LWP 24422]
[New LWP 24423]
[New LWP 24424]
[New LWP 24425]
[New LWP 24426]
[New LWP 24427]
[New LWP 24428]
[New LWP 24429]
[New LWP 24430]
[New LWP 24431]
[New LWP 24432]
[New LWP 24433]
[New LWP 24434]
[New LWP 24435]
[New LWP 24436]
[New LWP 24437]
[New LWP 24438]
[New LWP 24439]
[New LWP 24440]
[New LWP 24441]
[New LWP 24442]
0x00007fc5186b3d50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 24311 "kudu"  0x00007fc5186b3d50 in ?? ()
  2    LWP 24313 "kudu"  0x00007fc5186affb9 in ?? ()
  3    LWP 24314 "kudu"  0x00007fc5186affb9 in ?? ()
  4    LWP 24315 "kudu"  0x00007fc5186affb9 in ?? ()
  5    LWP 24316 "kernel-watcher-" 0x00007fc5186affb9 in ?? ()
  6    LWP 24322 "ntp client-2432" 0x00007fc5186b39e2 in ?? ()
  7    LWP 24323 "file cache-evic" 0x00007fc5186affb9 in ?? ()
  8    LWP 24324 "sq_acceptor" 0x00007fc5167c4cb9 in ?? ()
  9    LWP 24327 "rpc reactor-243" 0x00007fc5167d1a47 in ?? ()
  10   LWP 24328 "rpc reactor-243" 0x00007fc5167d1a47 in ?? ()
  11   LWP 24329 "rpc reactor-243" 0x00007fc5167d1a47 in ?? ()
  12   LWP 24330 "rpc reactor-243" 0x00007fc5167d1a47 in ?? ()
  13   LWP 24331 "MaintenanceMgr " 0x00007fc5186afad3 in ?? ()
  14   LWP 24332 "txn-status-mana" 0x00007fc5186affb9 in ?? ()
  15   LWP 24334 "collect_and_rem" 0x00007fc5186affb9 in ?? ()
  16   LWP 24335 "tc-session-exp-" 0x00007fc5186affb9 in ?? ()
  17   LWP 24336 "rpc worker-2433" 0x00007fc5186afad3 in ?? ()
  18   LWP 24337 "rpc worker-2433" 0x00007fc5186afad3 in ?? ()
  19   LWP 24338 "rpc worker-2433" 0x00007fc5186afad3 in ?? ()
  20   LWP 24339 "rpc worker-2433" 0x00007fc5186afad3 in ?? ()
  21   LWP 24340 "rpc worker-2434" 0x00007fc5186afad3 in ?? ()
  22   LWP 24341 "rpc worker-2434" 0x00007fc5186afad3 in ?? ()
  23   LWP 24342 "rpc worker-2434" 0x00007fc5186afad3 in ?? ()
  24   LWP 24343 "rpc worker-2434" 0x00007fc5186afad3 in ?? ()
  25   LWP 24344 "rpc worker-2434" 0x00007fc5186afad3 in ?? ()
  26   LWP 24345 "rpc worker-2434" 0x00007fc5186afad3 in ?? ()
  27   LWP 24346 "rpc worker-2434" 0x00007fc5186afad3 in ?? ()
  28   LWP 24347 "rpc worker-2434" 0x00007fc5186afad3 in ?? ()
  29   LWP 24348 "rpc worker-2434" 0x00007fc5186afad3 in ?? ()
  30   LWP 24349 "rpc worker-2434" 0x00007fc5186afad3 in ?? ()
  31   LWP 24350 "rpc worker-2435" 0x00007fc5186afad3 in ?? ()
  32   LWP 24351 "rpc worker-2435" 0x00007fc5186afad3 in ?? ()
  33   LWP 24352 "rpc worker-2435" 0x00007fc5186afad3 in ?? ()
  34   LWP 24353 "rpc worker-2435" 0x00007fc5186afad3 in ?? ()
  35   LWP 24354 "rpc worker-2435" 0x00007fc5186afad3 in ?? ()
  36   LWP 24355 "rpc worker-2435" 0x00007fc5186afad3 in ?? ()
  37   LWP 24356 "rpc worker-2435" 0x00007fc5186afad3 in ?? ()
  38   LWP 24357 "rpc worker-2435" 0x00007fc5186afad3 in ?? ()
  39   LWP 24358 "rpc worker-2435" 0x00007fc5186afad3 in ?? ()
  40   LWP 24359 "rpc worker-2435" 0x00007fc5186afad3 in ?? ()
  41   LWP 24360 "rpc worker-2436" 0x00007fc5186afad3 in ?? ()
  42   LWP 24361 "rpc worker-2436" 0x00007fc5186afad3 in ?? ()
  43   LWP 24362 "rpc worker-2436" 0x00007fc5186afad3 in ?? ()
  44   LWP 24363 "rpc worker-2436" 0x00007fc5186afad3 in ?? ()
  45   LWP 24364 "rpc worker-2436" 0x00007fc5186afad3 in ?? ()
  46   LWP 24365 "rpc worker-2436" 0x00007fc5186afad3 in ?? ()
  47   LWP 24366 "rpc worker-2436" 0x00007fc5186afad3 in ?? ()
  48   LWP 24367 "rpc worker-2436" 0x00007fc5186afad3 in ?? ()
  49   LWP 24368 "rpc worker-2436" 0x00007fc5186afad3 in ?? ()
  50   LWP 24369 "rpc worker-2436" 0x00007fc5186afad3 in ?? ()
  51   LWP 24370 "rpc worker-2437" 0x00007fc5186afad3 in ?? ()
  52   LWP 24371 "rpc worker-2437" 0x00007fc5186afad3 in ?? ()
  53   LWP 24372 "rpc worker-2437" 0x00007fc5186afad3 in ?? ()
  54   LWP 24373 "rpc worker-2437" 0x00007fc5186afad3 in ?? ()
  55   LWP 24374 "rpc worker-2437" 0x00007fc5186afad3 in ?? ()
  56   LWP 24375 "rpc worker-2437" 0x00007fc5186afad3 in ?? ()
  57   LWP 24376 "rpc worker-2437" 0x00007fc5186afad3 in ?? ()
  58   LWP 24377 "rpc worker-2437" 0x00007fc5186afad3 in ?? ()
  59   LWP 24378 "rpc worker-2437" 0x00007fc5186afad3 in ?? ()
  60   LWP 24379 "rpc worker-2437" 0x00007fc5186afad3 in ?? ()
  61   LWP 24380 "rpc worker-2438" 0x00007fc5186afad3 in ?? ()
  62   LWP 24381 "rpc worker-2438" 0x00007fc5186afad3 in ?? ()
  63   LWP 24382 "rpc worker-2438" 0x00007fc5186afad3 in ?? ()
  64   LWP 24383 "rpc worker-2438" 0x00007fc5186afad3 in ?? ()
  65   LWP 24384 "rpc worker-2438" 0x00007fc5186afad3 in ?? ()
  66   LWP 24385 "rpc worker-2438" 0x00007fc5186afad3 in ?? ()
  67   LWP 24386 "rpc worker-2438" 0x00007fc5186afad3 in ?? ()
  68   LWP 24387 "rpc worker-2438" 0x00007fc5186afad3 in ?? ()
  69   LWP 24388 "rpc worker-2438" 0x00007fc5186afad3 in ?? ()
  70   LWP 24389 "rpc worker-2438" 0x00007fc5186afad3 in ?? ()
  71   LWP 24390 "rpc worker-2439" 0x00007fc5186afad3 in ?? ()
  72   LWP 24391 "rpc worker-2439" 0x00007fc5186afad3 in ?? ()
  73   LWP 24392 "rpc worker-2439" 0x00007fc5186afad3 in ?? ()
  74   LWP 24393 "rpc worker-2439" 0x00007fc5186afad3 in ?? ()
  75   LWP 24394 "rpc worker-2439" 0x00007fc5186afad3 in ?? ()
  76   LWP 24395 "rpc worker-2439" 0x00007fc5186afad3 in ?? ()
  77   LWP 24396 "rpc worker-2439" 0x00007fc5186afad3 in ?? ()
  78   LWP 24397 "rpc worker-2439" 0x00007fc5186afad3 in ?? ()
  79   LWP 24398 "rpc worker-2439" 0x00007fc5186afad3 in ?? ()
  80   LWP 24399 "rpc worker-2439" 0x00007fc5186afad3 in ?? ()
  81   LWP 24400 "rpc worker-2440" 0x00007fc5186afad3 in ?? ()
  82   LWP 24401 "rpc worker-2440" 0x00007fc5186afad3 in ?? ()
  83   LWP 24402 "rpc worker-2440" 0x00007fc5186afad3 in ?? ()
  84   LWP 24403 "rpc worker-2440" 0x00007fc5186afad3 in ?? ()
  85   LWP 24404 "rpc worker-2440" 0x00007fc5186afad3 in ?? ()
  86   LWP 24405 "rpc worker-2440" 0x00007fc5186afad3 in ?? ()
  87   LWP 24406 "rpc worker-2440" 0x00007fc5186afad3 in ?? ()
  88   LWP 24407 "rpc worker-2440" 0x00007fc5186afad3 in ?? ()
  89   LWP 24408 "rpc worker-2440" 0x00007fc5186afad3 in ?? ()
  90   LWP 24409 "rpc worker-2440" 0x00007fc5186afad3 in ?? ()
  91   LWP 24410 "rpc worker-2441" 0x00007fc5186afad3 in ?? ()
  92   LWP 24411 "rpc worker-2441" 0x00007fc5186afad3 in ?? ()
  93   LWP 24412 "rpc worker-2441" 0x00007fc5186afad3 in ?? ()
  94   LWP 24413 "rpc worker-2441" 0x00007fc5186afad3 in ?? ()
  95   LWP 24414 "rpc worker-2441" 0x00007fc5186afad3 in ?? ()
  96   LWP 24415 "rpc worker-2441" 0x00007fc5186afad3 in ?? ()
  97   LWP 24416 "rpc worker-2441" 0x00007fc5186afad3 in ?? ()
  98   LWP 24417 "rpc worker-2441" 0x00007fc5186afad3 in ?? ()
  99   LWP 24418 "rpc worker-2441" 0x00007fc5186afad3 in ?? ()
  100  LWP 24419 "rpc worker-2441" 0x00007fc5186afad3 in ?? ()
  101  LWP 24420 "rpc worker-2442" 0x00007fc5186afad3 in ?? ()
  102  LWP 24421 "rpc worker-2442" 0x00007fc5186afad3 in ?? ()
  103  LWP 24422 "rpc worker-2442" 0x00007fc5186afad3 in ?? ()
  104  LWP 24423 "rpc worker-2442" 0x00007fc5186afad3 in ?? ()
  105  LWP 24424 "rpc worker-2442" 0x00007fc5186afad3 in ?? ()
  106  LWP 24425 "rpc worker-2442" 0x00007fc5186afad3 in ?? ()
  107  LWP 24426 "rpc worker-2442" 0x00007fc5186afad3 in ?? ()
  108  LWP 24427 "rpc worker-2442" 0x00007fc5186afad3 in ?? ()
  109  LWP 24428 "rpc worker-2442" 0x00007fc5186afad3 in ?? ()
  110  LWP 24429 "rpc worker-2442" 0x00007fc5186afad3 in ?? ()
  111  LWP 24430 "rpc worker-2443" 0x00007fc5186afad3 in ?? ()
  112  LWP 24431 "rpc worker-2443" 0x00007fc5186afad3 in ?? ()
  113  LWP 24432 "rpc worker-2443" 0x00007fc5186afad3 in ?? ()
  114  LWP 24433 "rpc worker-2443" 0x00007fc5186afad3 in ?? ()
  115  LWP 24434 "rpc worker-2443" 0x00007fc5186afad3 in ?? ()
  116  LWP 24435 "rpc worker-2443" 0x00007fc5186afad3 in ?? ()
  117  LWP 24436 "diag-logger-244" 0x00007fc5186affb9 in ?? ()
  118  LWP 24437 "result-tracker-" 0x00007fc5186affb9 in ?? ()
  119  LWP 24438 "excess-log-dele" 0x00007fc5186affb9 in ?? ()
  120  LWP 24439 "tcmalloc-memory" 0x00007fc5186affb9 in ?? ()
  121  LWP 24440 "acceptor-24440" 0x00007fc5167d30c7 in ?? ()
  122  LWP 24441 "heartbeat-24441" 0x00007fc5186affb9 in ?? ()
  123  LWP 24442 "maintenance_sch" 0x00007fc5186affb9 in ?? ()

Thread 123 (LWP 24442):
#0  0x00007fc5186affb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000024 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055be6efbbe50 in ?? ()
#5  0x00007fc4cf156470 in ?? ()
#6  0x0000000000000048 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 24441):
#0  0x00007fc5186affb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000000b in ?? ()
#3  0x0000000100000081 in ?? ()
#4  0x000055be6ef25934 in ?? ()
#5  0x00007fc4cf9573f0 in ?? ()
#6  0x0000000000000017 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00007fc4cf957410 in ?? ()
#9  0x0000000200000000 in ?? ()
#10 0x0000008000000189 in ?? ()
#11 0x00007fc4cf957470 in ?? ()
#12 0x00007fc5183232e1 in ?? ()
#13 0x0000000000000000 in ?? ()

Thread 121 (LWP 24440):
#0  0x00007fc5167d30c7 in ?? ()
#1  0x00007fc4d0158020 in ?? ()
#2  0x00007fc518333c02 in ?? ()
#3  0x00007fc4d0158020 in ?? ()
#4  0x0000000000080800 in ?? ()
#5  0x00007fc4d01583e0 in ?? ()
#6  0x00007fc4d0158090 in ?? ()
#7  0x000055be6eee10f8 in ?? ()
#8  0x00007fc518339699 in ?? ()
#9  0x00007fc4d0158510 in ?? ()
#10 0x00007fc4d0158700 in ?? ()
#11 0x0000008000000000 in ?? ()
#12 0x00007fc5186b33a7 in ?? ()
#13 0x00007fc4d0159520 in ?? ()
#14 0x00007fc4d0158260 in ?? ()
#15 0x000055be6ef810c0 in ?? ()
#16 0x0000000000000000 in ?? ()

Thread 120 (LWP 24439):
#0  0x00007fc5186affb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffc68d41cc0 in ?? ()
#5  0x00007fc4d0959670 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 24438):
#0  0x00007fc5186affb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 118 (LWP 24437):
#0  0x00007fc5186affb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055be6ee59b70 in ?? ()
#5  0x00007fc4d195b680 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 24436):
#0  0x00007fc5186affb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055be6f1d0790 in ?? ()
#5  0x00007fc4d215c550 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 116 (LWP 24435):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000007 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055be6f1d933c in ?? ()
#4  0x00007fc4d295d5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc4d295d5f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055be6f1d9328 in ?? ()
#9  0x00007fc5186af770 in ?? ()
#10 0x00007fc4d295d5f0 in ?? ()
#11 0x00007fc4d295d650 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 115 (LWP 24434):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055be6f1d92bc in ?? ()
#4  0x00007fc4d315e5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc4d315e5f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055be6f1d92a8 in ?? ()
#9  0x00007fc5186af770 in ?? ()
#10 0x00007fc4d315e5f0 in ?? ()
#11 0x00007fc4d315e650 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 114 (LWP 24433):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 24432):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 24431):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 24430):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 24429):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 24428):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 24427):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 24426):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 24425):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 24424):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 24423):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 24422):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 24421):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 24420):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 24419):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 24418):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 24417):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 97 (LWP 24416):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 24415):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 24414):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 24413):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 24412):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 24411):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 24410):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 24409):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 24408):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 24407):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 24406):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 24405):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 24404):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 24403):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 24402):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 24401):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 24400):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 24399):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 24398):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 24397):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 24396):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 76 (LWP 24395):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 75 (LWP 24394):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 74 (LWP 24393):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 24392):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 24391):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 24390):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 24389):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 24388):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 24387):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 24386):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 24385):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 24384):
#0  0x00007fc5186afad3 in ?? ()
#1  0x00000000000004a8 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055be6f1d8f38 in ?? ()
#4  0x00007fc4ec1905d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc4ec1905f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 64 (LWP 24383):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 24382):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 24381):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 24380):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 24379):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 24378):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 24377):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 24376):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 56 (LWP 24375):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 55 (LWP 24374):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 24373):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 24372):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 24371):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 24370):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 24369):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 24368):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 24367):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 24366):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 24365):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 24364):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 24363):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 24362):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 24361):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 24360):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 24359):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 24358):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055be6f1d9038 in ?? ()
#4  0x00007fc4f91aa5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc4f91aa5f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 38 (LWP 24357):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 24356):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 36 (LWP 24355):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 35 (LWP 24354):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 34 (LWP 24353):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 33 (LWP 24352):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 32 (LWP 24351):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 31 (LWP 24350):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 30 (LWP 24349):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 24348):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 24347):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 24346):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 24345):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 24344):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 24343):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 24342):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 24341):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 24340):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 24339):
#0  0x00007fc5186afad3 in ?? ()
#1  0x000000000000178c in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055be6f1d9638 in ?? ()
#4  0x00007fc5029bd5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc5029bd5f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 19 (LWP 24338):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000001c70 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055be6f1d96b8 in ?? ()
#4  0x00007fc5031be5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc5031be5f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 18 (LWP 24337):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000001e31 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055be6f1d973c in ?? ()
#4  0x00007fc5039bf5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc5039bf5f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055be6f1d9728 in ?? ()
#9  0x00007fc5186af770 in ?? ()
#10 0x00007fc5039bf5f0 in ?? ()
#11 0x00007fc5039bf650 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 17 (LWP 24336):
#0  0x00007fc5186afad3 in ?? ()
#1  0x000000000000046d in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055be6eecdcbc in ?? ()
#4  0x00007fc5041c05d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fc5041c05f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055be6eecdca8 in ?? ()
#9  0x00007fc5186af770 in ?? ()
#10 0x00007fc5041c05f0 in ?? ()
#11 0x00007fc5041c0650 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 16 (LWP 24335):
#0  0x00007fc5186affb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 15 (LWP 24334):
#0  0x00007fc5186affb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055be6ee3f6c8 in ?? ()
#5  0x00007fc5051c26a0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 24332):
#0  0x00007fc5186affb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 13 (LWP 24331):
#0  0x00007fc5186afad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 24330):
#0  0x00007fc5167d1a47 in ?? ()
#1  0x00007fc5071c6680 in ?? ()
#2  0x00007fc511ad5571 in ?? ()
#3  0x00000000000002b7 in ?? ()
#4  0x000055be6ef36e58 in ?? ()
#5  0x00007fc5071c66c0 in ?? ()
#6  0x00007fc5071c6840 in ?? ()
#7  0x000055be6efd8c10 in ?? ()
#8  0x00007fc511ad725d in ?? ()
#9  0x3fad57813d928000 in ?? ()
#10 0x000055be6ef28c00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055be6ef28c00 in ?? ()
#13 0x000000006ef36e58 in ?? ()
#14 0x000055be00000000 in ?? ()
#15 0x41da3ecc3e24688a in ?? ()
#16 0x000055be6efd8c10 in ?? ()
#17 0x00007fc5071c6720 in ?? ()
#18 0x00007fc511adbba3 in ?? ()
#19 0x0000000000000000 in ?? ()

Thread 11 (LWP 24329):
#0  0x00007fc5167d1a47 in ?? ()
#1  0x00007fc5079c7680 in ?? ()
#2  0x00007fc511ad5571 in ?? ()
#3  0x00000000000002b7 in ?? ()
#4  0x000055be6ef37a98 in ?? ()
#5  0x00007fc5079c76c0 in ?? ()
#6  0x00007fc5079c7840 in ?? ()
#7  0x000055be6efd8c10 in ?? ()
#8  0x00007fc511ad725d in ?? ()
#9  0x3fb989d0b0a08000 in ?? ()
#10 0x000055be6ef27600 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055be6ef27600 in ?? ()
#13 0x000000006ef37a98 in ?? ()
#14 0x000055be00000000 in ?? ()
#15 0x41da3ecc3e246889 in ?? ()
#16 0x000055be6efd8c10 in ?? ()
#17 0x00007fc5079c7720 in ?? ()
#18 0x00007fc511adbba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb989d0b0a08000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 10 (LWP 24328):
#0  0x00007fc5167d1a47 in ?? ()
#1  0x00007fc5081c8680 in ?? ()
#2  0x00007fc511ad5571 in ?? ()
#3  0x00000000000002b7 in ?? ()
#4  0x000055be6ef37c58 in ?? ()
#5  0x00007fc5081c86c0 in ?? ()
#6  0x00007fc5081c8840 in ?? ()
#7  0x000055be6efd8c10 in ?? ()
#8  0x00007fc511ad725d in ?? ()
#9  0x3fb961f75ef14000 in ?? ()
#10 0x000055be6ef27b80 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055be6ef27b80 in ?? ()
#13 0x000000006ef37c58 in ?? ()
#14 0x000055be00000000 in ?? ()
#15 0x41da3ecc3e24688b in ?? ()
#16 0x000055be6efd8c10 in ?? ()
#17 0x00007fc5081c8720 in ?? ()
#18 0x00007fc511adbba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb961f75ef14000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 9 (LWP 24327):
#0  0x00007fc5167d1a47 in ?? ()
#1  0x00007fc509db3680 in ?? ()
#2  0x00007fc511ad5571 in ?? ()
#3  0x00000000000002b7 in ?? ()
#4  0x000055be6ef37e18 in ?? ()
#5  0x00007fc509db36c0 in ?? ()
#6  0x00007fc509db3840 in ?? ()
#7  0x000055be6efd8c10 in ?? ()
#8  0x00007fc511ad725d in ?? ()
#9  0x3fb95f1fb9ea4000 in ?? ()
#10 0x000055be6ef28680 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055be6ef28680 in ?? ()
#13 0x000000006ef37e18 in ?? ()
#14 0x000055be00000000 in ?? ()
#15 0x41da3ecc3e24688b in ?? ()
#16 0x000055be6efd8c10 in ?? ()
#17 0x00007fc509db3720 in ?? ()
#18 0x00007fc511adbba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb95f1fb9ea4000 in ?? ()
#21 0x000000006e5c80a0 in ?? ()
#22 0x000055be6ef28680 in ?? ()
#23 0x00007fc509db3860 in ?? ()
#24 0x3fb95f1fb9ea4000 in ?? ()
#25 0x0000000000000000 in ?? ()

Thread 8 (LWP 24324):
#0  0x00007fc5167c4cb9 in ?? ()
#1  0x00007fc50b5b6840 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 7 (LWP 24323):
#0  0x00007fc5186affb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 6 (LWP 24322):
#0  0x00007fc5186b39e2 in ?? ()
#1  0x000055be6ee59ee0 in ?? ()
#2  0x00007fc50a5b44d0 in ?? ()
#3  0x00007fc50a5b4450 in ?? ()
#4  0x00007fc50a5b4570 in ?? ()
#5  0x00007fc50a5b4790 in ?? ()
#6  0x00007fc50a5b47a0 in ?? ()
#7  0x00007fc50a5b44e0 in ?? ()
#8  0x00007fc50a5b44d0 in ?? ()
#9  0x000055be6ee58350 in ?? ()
#10 0x00007fc518a9ec6f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 5 (LWP 24316):
#0  0x00007fc5186affb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000002d in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055be6efdedc8 in ?? ()
#5  0x00007fc50c5b8430 in ?? ()
#6  0x000000000000005a in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 4 (LWP 24315):
#0  0x00007fc5186affb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055be6ee3e848 in ?? ()
#5  0x00007fc50cdb9790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 3 (LWP 24314):
#0  0x00007fc5186affb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055be6ee3e2a8 in ?? ()
#5  0x00007fc50d5ba790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 2 (LWP 24313):
#0  0x00007fc5186affb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055be6ee3e188 in ?? ()
#5  0x00007fc50ddbb790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 1 (LWP 24311):
#0  0x00007fc5186b3d50 in ?? ()
#1  0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20251024 08:17:05.587875 18753 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 3 with UUID 36b0ebc9a5694a778497ec8d94aba993 and pid 24179
************************ BEGIN STACKS **************************
[New LWP 24181]
[New LWP 24182]
[New LWP 24183]
[New LWP 24184]
[New LWP 24190]
[New LWP 24191]
[New LWP 24192]
[New LWP 24195]
[New LWP 24196]
[New LWP 24197]
[New LWP 24198]
[New LWP 24199]
[New LWP 24200]
[New LWP 24201]
[New LWP 24202]
[New LWP 24203]
[New LWP 24204]
[New LWP 24205]
[New LWP 24206]
[New LWP 24207]
[New LWP 24208]
[New LWP 24209]
[New LWP 24210]
[New LWP 24211]
[New LWP 24212]
[New LWP 24213]
[New LWP 24214]
[New LWP 24215]
[New LWP 24216]
[New LWP 24217]
[New LWP 24218]
[New LWP 24219]
[New LWP 24220]
[New LWP 24221]
[New LWP 24222]
[New LWP 24223]
[New LWP 24224]
[New LWP 24225]
[New LWP 24226]
[New LWP 24227]
[New LWP 24228]
[New LWP 24229]
[New LWP 24230]
[New LWP 24231]
[New LWP 24232]
[New LWP 24233]
[New LWP 24234]
[New LWP 24235]
[New LWP 24236]
[New LWP 24237]
[New LWP 24238]
[New LWP 24239]
[New LWP 24240]
[New LWP 24241]
[New LWP 24242]
[New LWP 24243]
[New LWP 24244]
[New LWP 24245]
[New LWP 24246]
[New LWP 24247]
[New LWP 24248]
[New LWP 24249]
[New LWP 24250]
[New LWP 24251]
[New LWP 24252]
[New LWP 24253]
[New LWP 24254]
[New LWP 24255]
[New LWP 24256]
[New LWP 24257]
[New LWP 24258]
[New LWP 24259]
[New LWP 24260]
[New LWP 24261]
[New LWP 24262]
[New LWP 24263]
[New LWP 24264]
[New LWP 24265]
[New LWP 24266]
[New LWP 24267]
[New LWP 24268]
[New LWP 24269]
[New LWP 24270]
[New LWP 24271]
[New LWP 24272]
[New LWP 24273]
[New LWP 24274]
[New LWP 24275]
[New LWP 24276]
[New LWP 24277]
[New LWP 24278]
[New LWP 24279]
[New LWP 24280]
[New LWP 24281]
[New LWP 24282]
[New LWP 24283]
[New LWP 24284]
[New LWP 24285]
[New LWP 24286]
[New LWP 24287]
[New LWP 24288]
[New LWP 24289]
[New LWP 24290]
[New LWP 24291]
[New LWP 24292]
[New LWP 24293]
[New LWP 24294]
[New LWP 24295]
[New LWP 24296]
[New LWP 24297]
[New LWP 24298]
[New LWP 24299]
[New LWP 24300]
[New LWP 24301]
[New LWP 24302]
[New LWP 24303]
[New LWP 24304]
[New LWP 24305]
[New LWP 24306]
[New LWP 24307]
[New LWP 24308]
[New LWP 24309]
0x00007f5349e84d50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 24179 "kudu"  0x00007f5349e84d50 in ?? ()
  2    LWP 24181 "kudu"  0x00007f5349e80fb9 in ?? ()
  3    LWP 24182 "kudu"  0x00007f5349e80fb9 in ?? ()
  4    LWP 24183 "kudu"  0x00007f5349e80fb9 in ?? ()
  5    LWP 24184 "kernel-watcher-" 0x00007f5349e80fb9 in ?? ()
  6    LWP 24190 "ntp client-2419" 0x00007f5349e849e2 in ?? ()
  7    LWP 24191 "file cache-evic" 0x00007f5349e80fb9 in ?? ()
  8    LWP 24192 "sq_acceptor" 0x00007f5347f95cb9 in ?? ()
  9    LWP 24195 "rpc reactor-241" 0x00007f5347fa2a47 in ?? ()
  10   LWP 24196 "rpc reactor-241" 0x00007f5347fa2a47 in ?? ()
  11   LWP 24197 "rpc reactor-241" 0x00007f5347fa2a47 in ?? ()
  12   LWP 24198 "rpc reactor-241" 0x00007f5347fa2a47 in ?? ()
  13   LWP 24199 "MaintenanceMgr " 0x00007f5349e80ad3 in ?? ()
  14   LWP 24200 "txn-status-mana" 0x00007f5349e80fb9 in ?? ()
  15   LWP 24201 "collect_and_rem" 0x00007f5349e80fb9 in ?? ()
  16   LWP 24202 "tc-session-exp-" 0x00007f5349e80fb9 in ?? ()
  17   LWP 24203 "rpc worker-2420" 0x00007f5349e80ad3 in ?? ()
  18   LWP 24204 "rpc worker-2420" 0x00007f5349e80ad3 in ?? ()
  19   LWP 24205 "rpc worker-2420" 0x00007f5349e80ad3 in ?? ()
  20   LWP 24206 "rpc worker-2420" 0x00007f5349e80ad3 in ?? ()
  21   LWP 24207 "rpc worker-2420" 0x00007f5349e80ad3 in ?? ()
  22   LWP 24208 "rpc worker-2420" 0x00007f5349e80ad3 in ?? ()
  23   LWP 24209 "rpc worker-2420" 0x00007f5349e80ad3 in ?? ()
  24   LWP 24210 "rpc worker-2421" 0x00007f5349e80ad3 in ?? ()
  25   LWP 24211 "rpc worker-2421" 0x00007f5349e80ad3 in ?? ()
  26   LWP 24212 "rpc worker-2421" 0x00007f5349e80ad3 in ?? ()
  27   LWP 24213 "rpc worker-2421" 0x00007f5349e80ad3 in ?? ()
  28   LWP 24214 "rpc worker-2421" 0x00007f5349e80ad3 in ?? ()
  29   LWP 24215 "rpc worker-2421" 0x00007f5349e80ad3 in ?? ()
  30   LWP 24216 "rpc worker-2421" 0x00007f5349e80ad3 in ?? ()
  31   LWP 24217 "rpc worker-2421" 0x00007f5349e80ad3 in ?? ()
  32   LWP 24218 "rpc worker-2421" 0x00007f5349e80ad3 in ?? ()
  33   LWP 24219 "rpc worker-2421" 0x00007f5349e80ad3 in ?? ()
  34   LWP 24220 "rpc worker-2422" 0x00007f5349e80ad3 in ?? ()
  35   LWP 24221 "rpc worker-2422" 0x00007f5349e80ad3 in ?? ()
  36   LWP 24222 "rpc worker-2422" 0x00007f5349e80ad3 in ?? ()
  37   LWP 24223 "rpc worker-2422" 0x00007f5349e80ad3 in ?? ()
  38   LWP 24224 "rpc worker-2422" 0x00007f5349e80ad3 in ?? ()
  39   LWP 24225 "rpc worker-2422" 0x00007f5349e80ad3 in ?? ()
  40   LWP 24226 "rpc worker-2422" 0x00007f5349e80ad3 in ?? ()
  41   LWP 24227 "rpc worker-2422" 0x00007f5349e80ad3 in ?? ()
  42   LWP 24228 "rpc worker-2422" 0x00007f5349e80ad3 in ?? ()
  43   LWP 24229 "rpc worker-2422" 0x00007f5349e80ad3 in ?? ()
  44   LWP 24230 "rpc worker-2423" 0x00007f5349e80ad3 in ?? ()
  45   LWP 24231 "rpc worker-2423" 0x00007f5349e80ad3 in ?? ()
  46   LWP 24232 "rpc worker-2423" 0x00007f5349e80ad3 in ?? ()
  47   LWP 24233 "rpc worker-2423" 0x00007f5349e80ad3 in ?? ()
  48   LWP 24234 "rpc worker-2423" 0x00007f5349e80ad3 in ?? ()
  49   LWP 24235 "rpc worker-2423" 0x00007f5349e80ad3 in ?? ()
  50   LWP 24236 "rpc worker-2423" 0x00007f5349e80ad3 in ?? ()
  51   LWP 24237 "rpc worker-2423" 0x00007f5349e80ad3 in ?? ()
  52   LWP 24238 "rpc worker-2423" 0x00007f5349e80ad3 in ?? ()
  53   LWP 24239 "rpc worker-2423" 0x00007f5349e80ad3 in ?? ()
  54   LWP 24240 "rpc worker-2424" 0x00007f5349e80ad3 in ?? ()
  55   LWP 24241 "rpc worker-2424" 0x00007f5349e80ad3 in ?? ()
  56   LWP 24242 "rpc worker-2424" 0x00007f5349e80ad3 in ?? ()
  57   LWP 24243 "rpc worker-2424" 0x00007f5349e80ad3 in ?? ()
  58   LWP 24244 "rpc worker-2424" 0x00007f5349e80ad3 in ?? ()
  59   LWP 24245 "rpc worker-2424" 0x00007f5349e80ad3 in ?? ()
  60   LWP 24246 "rpc worker-2424" 0x00007f5349e80ad3 in ?? ()
  61   LWP 24247 "rpc worker-2424" 0x00007f5349e80ad3 in ?? ()
  62   LWP 24248 "rpc worker-2424" 0x00007f5349e80ad3 in ?? ()
  63   LWP 24249 "rpc worker-2424" 0x00007f5349e80ad3 in ?? ()
  64   LWP 24250 "rpc worker-2425" 0x00007f5349e80ad3 in ?? ()
  65   LWP 24251 "rpc worker-2425" 0x00007f5349e80ad3 in ?? ()
  66   LWP 24252 "rpc worker-2425" 0x00007f5349e80ad3 in ?? ()
  67   LWP 24253 "rpc worker-2425" 0x00007f5349e80ad3 in ?? ()
  68   LWP 24254 "rpc worker-2425" 0x00007f5349e80ad3 in ?? ()
  69   LWP 24255 "rpc worker-2425" 0x00007f5349e80ad3 in ?? ()
  70   LWP 24256 "rpc worker-2425" 0x00007f5349e80ad3 in ?? ()
  71   LWP 24257 "rpc worker-2425" 0x00007f5349e80ad3 in ?? ()
  72   LWP 24258 "rpc worker-2425" 0x00007f5349e80ad3 in ?? ()
  73   LWP 24259 "rpc worker-2425" 0x00007f5349e80ad3 in ?? ()
  74   LWP 24260 "rpc worker-2426" 0x00007f5349e80ad3 in ?? ()
  75   LWP 24261 "rpc worker-2426" 0x00007f5349e80ad3 in ?? ()
  76   LWP 24262 "rpc worker-2426" 0x00007f5349e80ad3 in ?? ()
  77   LWP 24263 "rpc worker-2426" 0x00007f5349e80ad3 in ?? ()
  78   LWP 24264 "rpc worker-2426" 0x00007f5349e80ad3 in ?? ()
  79   LWP 24265 "rpc worker-2426" 0x00007f5349e80ad3 in ?? ()
  80   LWP 24266 "rpc worker-2426" 0x00007f5349e80ad3 in ?? ()
  81   LWP 24267 "rpc worker-2426" 0x00007f5349e80ad3 in ?? ()
  82   LWP 24268 "rpc worker-2426" 0x00007f5349e80ad3 in ?? ()
  83   LWP 24269 "rpc worker-2426" 0x00007f5349e80ad3 in ?? ()
  84   LWP 24270 "rpc worker-2427" 0x00007f5349e80ad3 in ?? ()
  85   LWP 24271 "rpc worker-2427" 0x00007f5349e80ad3 in ?? ()
  86   LWP 24272 "rpc worker-2427" 0x00007f5349e80ad3 in ?? ()
  87   LWP 24273 "rpc worker-2427" 0x00007f5349e80ad3 in ?? ()
  88   LWP 24274 "rpc worker-2427" 0x00007f5349e80ad3 in ?? ()
  89   LWP 24275 "rpc worker-2427" 0x00007f5349e80ad3 in ?? ()
  90   LWP 24276 "rpc worker-2427" 0x00007f5349e80ad3 in ?? ()
  91   LWP 24277 "rpc worker-2427" 0x00007f5349e80ad3 in ?? ()
  92   LWP 24278 "rpc worker-2427" 0x00007f5349e80ad3 in ?? ()
  93   LWP 24279 "rpc worker-2427" 0x00007f5349e80ad3 in ?? ()
  94   LWP 24280 "rpc worker-2428" 0x00007f5349e80ad3 in ?? ()
  95   LWP 24281 "rpc worker-2428" 0x00007f5349e80ad3 in ?? ()
  96   LWP 24282 "rpc worker-2428" 0x00007f5349e80ad3 in ?? ()
  97   LWP 24283 "rpc worker-2428" 0x00007f5349e80ad3 in ?? ()
  98   LWP 24284 "rpc worker-2428" 0x00007f5349e80ad3 in ?? ()
  99   LWP 24285 "rpc worker-2428" 0x00007f5349e80ad3 in ?? ()
  100  LWP 24286 "rpc worker-2428" 0x00007f5349e80ad3 in ?? ()
  101  LWP 24287 "rpc worker-2428" 0x00007f5349e80ad3 in ?? ()
  102  LWP 24288 "rpc worker-2428" 0x00007f5349e80ad3 in ?? ()
  103  LWP 24289 "rpc worker-2428" 0x00007f5349e80ad3 in ?? ()
  104  LWP 24290 "rpc worker-2429" 0x00007f5349e80ad3 in ?? ()
  105  LWP 24291 "rpc worker-2429" 0x00007f5349e80ad3 in ?? ()
  106  LWP 24292 "rpc worker-2429" 0x00007f5349e80ad3 in ?? ()
  107  LWP 24293 "rpc worker-2429" 0x00007f5349e80ad3 in ?? ()
  108  LWP 24294 "rpc worker-2429" 0x00007f5349e80ad3 in ?? ()
  109  LWP 24295 "rpc worker-2429" 0x00007f5349e80ad3 in ?? ()
  110  LWP 24296 "rpc worker-2429" 0x00007f5349e80ad3 in ?? ()
  111  LWP 24297 "rpc worker-2429" 0x00007f5349e80ad3 in ?? ()
  112  LWP 24298 "rpc worker-2429" 0x00007f5349e80ad3 in ?? ()
  113  LWP 24299 "rpc worker-2429" 0x00007f5349e80ad3 in ?? ()
  114  LWP 24300 "rpc worker-2430" 0x00007f5349e80ad3 in ?? ()
  115  LWP 24301 "rpc worker-2430" 0x00007f5349e80ad3 in ?? ()
  116  LWP 24302 "rpc worker-2430" 0x00007f5349e80ad3 in ?? ()
  117  LWP 24303 "diag-logger-243" 0x00007f5349e80fb9 in ?? ()
  118  LWP 24304 "result-tracker-" 0x00007f5349e80fb9 in ?? ()
  119  LWP 24305 "excess-log-dele" 0x00007f5349e80fb9 in ?? ()
  120  LWP 24306 "tcmalloc-memory" 0x00007f5349e80fb9 in ?? ()
  121  LWP 24307 "acceptor-24307" 0x00007f5347fa40c7 in ?? ()
  122  LWP 24308 "heartbeat-24308" 0x00007f5349e80fb9 in ?? ()
  123  LWP 24309 "maintenance_sch" 0x00007f5349e80fb9 in ?? ()

Thread 123 (LWP 24309):
#0  0x00007f5349e80fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000026 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a18050de50 in ?? ()
#5  0x00007f5301128470 in ?? ()
#6  0x000000000000004c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 24308):
#0  0x00007f5349e80fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a180477930 in ?? ()
#5  0x00007f53019293f0 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 121 (LWP 24307):
#0  0x00007f5347fa40c7 in ?? ()
#1  0x00007f530212a020 in ?? ()
#2  0x00007f5349b04c02 in ?? ()
#3  0x00007f530212a020 in ?? ()
#4  0x0000000000080800 in ?? ()
#5  0x00007f530212a3e0 in ?? ()
#6  0x00007f530212a090 in ?? ()
#7  0x000055a1804330f8 in ?? ()
#8  0x00007f5349b0a699 in ?? ()
#9  0x00007f530212a510 in ?? ()
#10 0x00007f530212a700 in ?? ()
#11 0x0000008000000000 in ?? ()
#12 0x00007f5349e843a7 in ?? ()
#13 0x00007f530212b520 in ?? ()
#14 0x00007f530212a260 in ?? ()
#15 0x000055a1804d30c0 in ?? ()
#16 0x0000000000000000 in ?? ()

Thread 120 (LWP 24306):
#0  0x00007f5349e80fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffd42b663a0 in ?? ()
#5  0x00007f530292b670 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 24305):
#0  0x00007f5349e80fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 118 (LWP 24304):
#0  0x00007f5349e80fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a1803abb70 in ?? ()
#5  0x00007f530392d680 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 24303):
#0  0x00007f5349e80fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a1806b4690 in ?? ()
#5  0x00007f530412e550 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 116 (LWP 24302):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000007 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055a180686ebc in ?? ()
#4  0x00007f530492f5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f530492f5f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055a180686ea8 in ?? ()
#9  0x00007f5349e80770 in ?? ()
#10 0x00007f530492f5f0 in ?? ()
#11 0x00007f530492f650 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 115 (LWP 24301):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055a180686e3c in ?? ()
#4  0x00007f53051305d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f53051305f0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055a180686e28 in ?? ()
#9  0x00007f5349e80770 in ?? ()
#10 0x00007f53051305f0 in ?? ()
#11 0x00007f5305130650 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 114 (LWP 24300):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 24299):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 24298):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 24297):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 24296):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 24295):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 24294):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 24293):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 24292):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 24291):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 24290):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 24289):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 24288):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 24287):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 24286):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 24285):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 24284):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 97 (LWP 24283):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 24282):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 24281):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 24280):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 24279):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 24278):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 24277):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 24276):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 24275):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 24274):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 24273):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 24272):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 24271):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 24270):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 24269):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 24268):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 24267):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 24266):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 24265):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 24264):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 24263):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 76 (LWP 24262):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055a1806778b8 in ?? ()
#4  0x00007f53189575d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f53189575f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 75 (LWP 24261):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 74 (LWP 24260):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 24259):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 24258):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 24257):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 24256):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 24255):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 24254):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 24253):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 24252):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 24251):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 24250):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 24249):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 24248):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 24247):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 24246):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 24245):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 24244):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 24243):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 56 (LWP 24242):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055a180676db8 in ?? ()
#4  0x00007f532296b5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f532296b5f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 55 (LWP 24241):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 24240):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 24239):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 24238):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 24237):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 24236):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 24235):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 24234):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 24233):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 24232):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 24231):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 24230):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 24229):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 24228):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 24227):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 24226):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 24225):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 24224):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 24223):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 36 (LWP 24222):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055a1806762b8 in ?? ()
#4  0x00007f532c97f5d0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f532c97f5f0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 35 (LWP 24221):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 34 (LWP 24220):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 33 (LWP 24219):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 32 (LWP 24218):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 31 (LWP 24217):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 30 (LWP 24216):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 24215):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 24214):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 24213):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 24212):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 24211):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 24210):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 24209):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 24208):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 24207):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 24206):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 19 (LWP 24205):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 18 (LWP 24204):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 17 (LWP 24203):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 16 (LWP 24202):
#0  0x00007f5349e80fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 15 (LWP 24201):
#0  0x00007f5349e80fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a1803916c8 in ?? ()
#5  0x00007f53371946a0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 24200):
#0  0x00007f5349e80fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 13 (LWP 24199):
#0  0x00007f5349e80ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 24198):
#0  0x00007f5347fa2a47 in ?? ()
#1  0x00007f5338997680 in ?? ()
#2  0x00007f53432a6571 in ?? ()
#3  0x00000000000002b8 in ?? ()
#4  0x000055a180488e58 in ?? ()
#5  0x00007f53389976c0 in ?? ()
#6  0x00007f5338997840 in ?? ()
#7  0x000055a18052ac10 in ?? ()
#8  0x00007f53432a825d in ?? ()
#9  0x3fb97c85bbd14000 in ?? ()
#10 0x000055a18047ac00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055a18047ac00 in ?? ()
#13 0x0000000080488e58 in ?? ()
#14 0x000055a100000000 in ?? ()
#15 0x41da3ecc3e24688c in ?? ()
#16 0x000055a18052ac10 in ?? ()
#17 0x00007f5338997720 in ?? ()
#18 0x00007f53432acba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb97c85bbd14000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 11 (LWP 24197):
#0  0x00007f5347fa2a47 in ?? ()
#1  0x00007f5339198680 in ?? ()
#2  0x00007f53432a6571 in ?? ()
#3  0x00000000000002b8 in ?? ()
#4  0x000055a180489a98 in ?? ()
#5  0x00007f53391986c0 in ?? ()
#6  0x00007f5339198840 in ?? ()
#7  0x000055a18052ac10 in ?? ()
#8  0x00007f53432a825d in ?? ()
#9  0x3fb97be4ac218000 in ?? ()
#10 0x000055a180479600 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055a180479600 in ?? ()
#13 0x0000000080489a98 in ?? ()
#14 0x000055a100000000 in ?? ()
#15 0x41da3ecc3e24688c in ?? ()
#16 0x000055a18052ac10 in ?? ()
#17 0x00007f5339198720 in ?? ()
#18 0x00007f53432acba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb97be4ac218000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 10 (LWP 24196):
#0  0x00007f5347fa2a47 in ?? ()
#1  0x00007f5339999680 in ?? ()
#2  0x00007f53432a6571 in ?? ()
#3  0x00000000000002b8 in ?? ()
#4  0x000055a180489c58 in ?? ()
#5  0x00007f53399996c0 in ?? ()
#6  0x00007f5339999840 in ?? ()
#7  0x000055a18052ac10 in ?? ()
#8  0x00007f53432a825d in ?? ()
#9  0x3fb9788b2377c000 in ?? ()
#10 0x000055a180479b80 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055a180479b80 in ?? ()
#13 0x0000000080489c58 in ?? ()
#14 0x000055a100000000 in ?? ()
#15 0x41da3ecc3e24688c in ?? ()
#16 0x000055a18052ac10 in ?? ()
#17 0x00007f5339999720 in ?? ()
#18 0x00007f53432acba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb9788b2377c000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 9 (LWP 24195):
#0  0x00007f5347fa2a47 in ?? ()
#1  0x00007f533b584680 in ?? ()
#2  0x00007f53432a6571 in ?? ()
#3  0x00000000000002b8 in ?? ()
#4  0x000055a180489e18 in ?? ()
#5  0x00007f533b5846c0 in ?? ()
#6  0x00007f533b584840 in ?? ()
#7  0x000055a18052ac10 in ?? ()
#8  0x00007f53432a825d in ?? ()
#9  0x3fb972da1f084000 in ?? ()
#10 0x000055a18047a100 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055a18047a100 in ?? ()
#13 0x0000000080489e18 in ?? ()
#14 0x000055a100000000 in ?? ()
#15 0x41da3ecc3e24688a in ?? ()
#16 0x000055a18052ac10 in ?? ()
#17 0x00007f533b584720 in ?? ()
#18 0x00007f53432acba3 in ?? ()
#19 0x0000000000000000 in ?? ()

Thread 8 (LWP 24192):
#0  0x00007f5347f95cb9 in ?? ()
#1  0x00007f533cd87840 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 7 (LWP 24191):
#0  0x00007f5349e80fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 6 (LWP 24190):
#0  0x00007f5349e849e2 in ?? ()
#1  0x000055a1803abee0 in ?? ()
#2  0x00007f533bd854d0 in ?? ()
#3  0x00007f533bd85450 in ?? ()
#4  0x00007f533bd85570 in ?? ()
#5  0x00007f533bd85790 in ?? ()
#6  0x00007f533bd857a0 in ?? ()
#7  0x00007f533bd854e0 in ?? ()
#8  0x00007f533bd854d0 in ?? ()
#9  0x000055a1803aa350 in ?? ()
#10 0x00007f534a26fc6f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 5 (LWP 24184):
#0  0x00007f5349e80fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000030 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a180530dc8 in ?? ()
#5  0x00007f533dd89430 in ?? ()
#6  0x0000000000000060 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 4 (LWP 24183):
#0  0x00007f5349e80fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a180390848 in ?? ()
#5  0x00007f533e58a790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 3 (LWP 24182):
#0  0x00007f5349e80fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a1803902a8 in ?? ()
#5  0x00007f533ed8b790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 2 (LWP 24181):
#0  0x00007f5349e80fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055a180390188 in ?? ()
#5  0x00007f533f58c790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 1 (LWP 24179):
#0  0x00007f5349e84d50 in ?? ()
#1  0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20251024 08:17:06.072073 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 24045
I20251024 08:17:06.084118 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 23911
I20251024 08:17:06.096768 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 24311
I20251024 08:17:06.110723 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 24179
I20251024 08:17:06.116164 18753 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskNzynA4/build/release/bin/kudu with pid 19888
2025-10-24T08:17:06Z chronyd exiting
I20251024 08:17:06.133477 18753 test_util.cc:183] -----------------------------------------------
I20251024 08:17:06.133559 18753 test_util.cc:184] Had failures, leaving test files at /tmp/dist-test-taskNzynA4/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1761293763299348-18753-0
[  FAILED  ] RollingRestartArgs/RollingRestartITest.TestWorkloads/4, where GetParam() = 800-byte object <01-00 00-00 01-00 00-00 04-00 00-00 00-00 00-00 20-E0 B4-48 B6-55 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 40-E0 B4-48 B6-55 00-00 00-00 00-00 00-00 00-00 ... 00-00 00-00 00-00 00-00 F8-E2 B4-48 B6-55 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-01 00-00 00-00 00-00 04-00 00-00 03-00 00-00 01-00 00-00 00-00 00-00> (47830 ms)
[----------] 1 test from RollingRestartArgs/RollingRestartITest (47830 ms total)

[----------] Global test environment tear-down
[==========] 2 tests from 2 test suites ran. (62830 ms total)
[  PASSED  ] 1 test.
[  FAILED  ] 1 test, listed below:
[  FAILED  ] RollingRestartArgs/RollingRestartITest.TestWorkloads/4, where GetParam() = 800-byte object <01-00 00-00 01-00 00-00 04-00 00-00 00-00 00-00 20-E0 B4-48 B6-55 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 40-E0 B4-48 B6-55 00-00 00-00 00-00 00-00 00-00 ... 00-00 00-00 00-00 00-00 F8-E2 B4-48 B6-55 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-01 00-00 00-00 00-00 04-00 00-00 03-00 00-00 01-00 00-00 00-00 00-00>

 1 FAILED TEST
I20251024 08:17:06.134104 18753 logging.cc:424] LogThrottler /home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/client/meta_cache.cc:302: suppressed but not reported on 18 messages since previous log ~10 seconds ago