Diagnosed failure

RollingRestartArgs/RollingRestartITest.TestWorkloads/4: /home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/integration-tests/maintenance_mode-itest.cc:751: Failure
Value of: s.ok()
  Actual: true
Expected: false
/home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/util/test_util.cc:403: Failure
Failed
Timed out waiting for assertion to pass.
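The "Timed out waiting for assertion to pass." line at test_util.cc:403 comes from Kudu's ASSERT_EVENTUALLY-style wrapper: the check at maintenance_mode-itest.cc:751 expected `s.ok()` to become false, and the wrapper kept re-running it until a deadline expired. A minimal standalone sketch of that retry pattern, under the assumption that it is a poll-until-deadline loop (`AssertEventually`, the timeout, and the poll interval here are illustrative, not Kudu's actual implementation):

```cpp
#include <chrono>
#include <functional>
#include <thread>

// Illustrative retry loop: re-run `check` until it returns true or the
// deadline passes. Returning false corresponds to the
// "Timed out waiting for assertion to pass." failure in the log above.
bool AssertEventually(const std::function<bool()>& check,
                      std::chrono::milliseconds timeout) {
  const auto deadline = std::chrono::steady_clock::now() + timeout;
  do {
    if (check()) {
      return true;  // the wrapped assertion finally passed
    }
    // Back off briefly between attempts (interval is a made-up value).
    std::this_thread::sleep_for(std::chrono::milliseconds(5));
  } while (std::chrono::steady_clock::now() < deadline);
  return false;  // deadline expired with the assertion still failing
}
```

In the failure above, the wrapped condition never became true before the deadline, so the inner gtest failure (`Actual: true`, `Expected: false`) was reported alongside the timeout.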
I20251212 21:11:33.110498 30111 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:33.113485 30376 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:33.114137 29976 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:33.542985 30244 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:34.215127 23994 external_mini_cluster-itest-base.cc:80] Found fatal failure
I20251212 21:11:34.215240 23994 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 0 with UUID 35867ec45b8041d48fc8c7bb132375c5 and pid 29981
************************ BEGIN STACKS **************************
0x00007f2acb221d50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 29981 "kudu"  0x00007f2acb221d50 in ?? ()
  2    LWP 29983 "kudu"  0x00007f2acb21dfb9 in ?? ()
  3    LWP 29984 "kudu"  0x00007f2acb21dfb9 in ?? ()
  4    LWP 29985 "kudu"  0x00007f2acb21dfb9 in ?? ()
  5    LWP 29986 "kernel-watcher-" 0x00007f2acb21dfb9 in ?? ()
  6    LWP 29992 "ntp client-2999" 0x00007f2acb2219e2 in ?? ()
  7    LWP 29993 "file cache-evic" 0x00007f2acb21dfb9 in ?? ()
  8    LWP 29994 "sq_acceptor" 0x00007f2ac9332cb9 in ?? ()
  9    LWP 29997 "rpc reactor-299" 0x00007f2ac933fa47 in ?? ()
  10   LWP 29998 "rpc reactor-299" 0x00007f2ac933fa47 in ?? ()
  11   LWP 29999 "rpc reactor-299" 0x00007f2ac933fa47 in ?? ()
  12   LWP 30000 "rpc reactor-300" 0x00007f2ac933fa47 in ?? ()
  13   LWP 30001 "MaintenanceMgr " 0x00007f2acb21dad3 in ?? ()
  14   LWP 30002 "txn-status-mana" 0x00007f2acb21dfb9 in ?? ()
  15   LWP 30004 "collect_and_rem" 0x00007f2acb21dfb9 in ?? ()
  16   LWP 30005 "tc-session-exp-" 0x00007f2acb21dfb9 in ?? ()
  17-116 LWP 30006-30105 "rpc worker-*" 0x00007f2acb21dad3 in ?? ()   [100 idle rpc worker threads, all parked in the same frame]
  117  LWP 30106 "diag-logger-301" 0x00007f2acb21dfb9 in ?? ()
  118  LWP 30107 "result-tracker-" 0x00007f2acb21dfb9 in ?? ()
  119  LWP 30108 "excess-log-dele" 0x00007f2acb21dfb9 in ?? ()
  120  LWP 30109 "tcmalloc-memory" 0x00007f2acb21dfb9 in ?? ()
  121  LWP 30110 "acceptor-30110" 0x00007f2ac93410c7 in ?? ()
  122  LWP 30111 "heartbeat-30111" 0x00007f2acb21dfb9 in ?? ()
  123  LWP 30112 "maintenance_sch" 0x00007f2acb21dfb9 in ?? ()

[Per-thread backtraces elided: the dump was taken without debug symbols, so all 123 backtraces consist solely of raw addresses shown as "?? ()". No function names are recoverable; the threads differ only in stack depth and raw addresses.]
************************* END STACKS ***************************
I20251212 21:11:34.734045 23994 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 1 with UUID dd9e48fb810447718c09aca5a01b0fe3 and pid 29846
************************ BEGIN STACKS **************************
0x00007fdf36f12d50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 29846 "kudu"  0x00007fdf36f12d50 in ?? ()
  2    LWP 29848 "kudu"  0x00007fdf36f0efb9 in ?? ()
  3    LWP 29849 "kudu"  0x00007fdf36f0efb9 in ?? ()
  4    LWP 29850 "kudu"  0x00007fdf36f0efb9 in ?? ()
  5    LWP 29851 "kernel-watcher-" 0x00007fdf36f0efb9 in ?? ()
  6    LWP 29857 "ntp client-2985" 0x00007fdf36f129e2 in ?? ()
  7    LWP 29858 "file cache-evic" 0x00007fdf36f0efb9 in ?? ()
  8    LWP 29859 "sq_acceptor" 0x00007fdf35023cb9 in ?? ()
  9    LWP 29862 "rpc reactor-298" 0x00007fdf35030a47 in ?? ()
  10   LWP 29863 "rpc reactor-298" 0x00007fdf35030a47 in ?? ()
  11   LWP 29864 "rpc reactor-298" 0x00007fdf35030a47 in ?? ()
  12   LWP 29865 "rpc reactor-298" 0x00007fdf35030a47 in ?? ()
  13   LWP 29866 "MaintenanceMgr " 0x00007fdf36f0ead3 in ?? ()
  14   LWP 29867 "txn-status-mana" 0x00007fdf36f0efb9 in ?? ()
  15   LWP 29869 "collect_and_rem" 0x00007fdf36f0efb9 in ?? ()
  16   LWP 29870 "tc-session-exp-" 0x00007fdf36f0efb9 in ?? ()
  17   LWP 29871 "rpc worker-2987" 0x00007fdf36f0ead3 in ?? ()
  18   LWP 29872 "rpc worker-2987" 0x00007fdf36f0ead3 in ?? ()
  19   LWP 29873 "rpc worker-2987" 0x00007fdf36f0ead3 in ?? ()
  20   LWP 29874 "rpc worker-2987" 0x00007fdf36f0ead3 in ?? ()
  21   LWP 29875 "rpc worker-2987" 0x00007fdf36f0ead3 in ?? ()
  22   LWP 29876 "rpc worker-2987" 0x00007fdf36f0ead3 in ?? ()
  23   LWP 29877 "rpc worker-2987" 0x00007fdf36f0ead3 in ?? ()
  24   LWP 29878 "rpc worker-2987" 0x00007fdf36f0ead3 in ?? ()
  25   LWP 29879 "rpc worker-2987" 0x00007fdf36f0ead3 in ?? ()
  26   LWP 29880 "rpc worker-2988" 0x00007fdf36f0ead3 in ?? ()
  27   LWP 29881 "rpc worker-2988" 0x00007fdf36f0ead3 in ?? ()
  28   LWP 29882 "rpc worker-2988" 0x00007fdf36f0ead3 in ?? ()
  29   LWP 29883 "rpc worker-2988" 0x00007fdf36f0ead3 in ?? ()
  30   LWP 29884 "rpc worker-2988" 0x00007fdf36f0ead3 in ?? ()
  31   LWP 29885 "rpc worker-2988" 0x00007fdf36f0ead3 in ?? ()
  32   LWP 29886 "rpc worker-2988" 0x00007fdf36f0ead3 in ?? ()
  33   LWP 29887 "rpc worker-2988" 0x00007fdf36f0ead3 in ?? ()
  34   LWP 29888 "rpc worker-2988" 0x00007fdf36f0ead3 in ?? ()
  35   LWP 29889 "rpc worker-2988" 0x00007fdf36f0ead3 in ?? ()
  36   LWP 29890 "rpc worker-2989" 0x00007fdf36f0ead3 in ?? ()
  37   LWP 29891 "rpc worker-2989" 0x00007fdf36f0ead3 in ?? ()
  38   LWP 29892 "rpc worker-2989" 0x00007fdf36f0ead3 in ?? ()
  39   LWP 29893 "rpc worker-2989" 0x00007fdf36f0ead3 in ?? ()
  40   LWP 29894 "rpc worker-2989" 0x00007fdf36f0ead3 in ?? ()
  41   LWP 29895 "rpc worker-2989" 0x00007fdf36f0ead3 in ?? ()
  42   LWP 29896 "rpc worker-2989" 0x00007fdf36f0ead3 in ?? ()
  43   LWP 29897 "rpc worker-2989" 0x00007fdf36f0ead3 in ?? ()
  44   LWP 29898 "rpc worker-2989" 0x00007fdf36f0ead3 in ?? ()
  45   LWP 29899 "rpc worker-2989" 0x00007fdf36f0ead3 in ?? ()
  46   LWP 29900 "rpc worker-2990" 0x00007fdf36f0ead3 in ?? ()
  47   LWP 29901 "rpc worker-2990" 0x00007fdf36f0ead3 in ?? ()
  48   LWP 29902 "rpc worker-2990" 0x00007fdf36f0ead3 in ?? ()
  49   LWP 29903 "rpc worker-2990" 0x00007fdf36f0ead3 in ?? ()
  50   LWP 29904 "rpc worker-2990" 0x00007fdf36f0ead3 in ?? ()
  51   LWP 29905 "rpc worker-2990" 0x00007fdf36f0ead3 in ?? ()
  52   LWP 29906 "rpc worker-2990" 0x00007fdf36f0ead3 in ?? ()
  53   LWP 29907 "rpc worker-2990" 0x00007fdf36f0ead3 in ?? ()
  54   LWP 29908 "rpc worker-2990" 0x00007fdf36f0ead3 in ?? ()
  55   LWP 29909 "rpc worker-2990" 0x00007fdf36f0ead3 in ?? ()
  56   LWP 29910 "rpc worker-2991" 0x00007fdf36f0ead3 in ?? ()
  57   LWP 29911 "rpc worker-2991" 0x00007fdf36f0ead3 in ?? ()
  58   LWP 29912 "rpc worker-2991" 0x00007fdf36f0ead3 in ?? ()
  59   LWP 29913 "rpc worker-2991" 0x00007fdf36f0ead3 in ?? ()
  60   LWP 29914 "rpc worker-2991" 0x00007fdf36f0ead3 in ?? ()
  61   LWP 29915 "rpc worker-2991" 0x00007fdf36f0ead3 in ?? ()
  62   LWP 29916 "rpc worker-2991" 0x00007fdf36f0ead3 in ?? ()
  63   LWP 29917 "rpc worker-2991" 0x00007fdf36f0ead3 in ?? ()
  64   LWP 29918 "rpc worker-2991" 0x00007fdf36f0ead3 in ?? ()
  65   LWP 29919 "rpc worker-2991" 0x00007fdf36f0ead3 in ?? ()
  66   LWP 29920 "rpc worker-2992" 0x00007fdf36f0ead3 in ?? ()
  67   LWP 29921 "rpc worker-2992" 0x00007fdf36f0ead3 in ?? ()
  68   LWP 29922 "rpc worker-2992" 0x00007fdf36f0ead3 in ?? ()
  69   LWP 29923 "rpc worker-2992" 0x00007fdf36f0ead3 in ?? ()
  70   LWP 29924 "rpc worker-2992" 0x00007fdf36f0ead3 in ?? ()
  71   LWP 29925 "rpc worker-2992" 0x00007fdf36f0ead3 in ?? ()
  72   LWP 29926 "rpc worker-2992" 0x00007fdf36f0ead3 in ?? ()
  73   LWP 29927 "rpc worker-2992" 0x00007fdf36f0ead3 in ?? ()
  74   LWP 29928 "rpc worker-2992" 0x00007fdf36f0ead3 in ?? ()
  75   LWP 29929 "rpc worker-2992" 0x00007fdf36f0ead3 in ?? ()
  76   LWP 29930 "rpc worker-2993" 0x00007fdf36f0ead3 in ?? ()
  77   LWP 29931 "rpc worker-2993" 0x00007fdf36f0ead3 in ?? ()
  78   LWP 29932 "rpc worker-2993" 0x00007fdf36f0ead3 in ?? ()
  79   LWP 29933 "rpc worker-2993" 0x00007fdf36f0ead3 in ?? ()
  80   LWP 29934 "rpc worker-2993" 0x00007fdf36f0ead3 in ?? ()
  81   LWP 29935 "rpc worker-2993" 0x00007fdf36f0ead3 in ?? ()
  82   LWP 29936 "rpc worker-2993" 0x00007fdf36f0ead3 in ?? ()
  83   LWP 29937 "rpc worker-2993" 0x00007fdf36f0ead3 in ?? ()
  84   LWP 29938 "rpc worker-2993" 0x00007fdf36f0ead3 in ?? ()
  85   LWP 29939 "rpc worker-2993" 0x00007fdf36f0ead3 in ?? ()
  86   LWP 29940 "rpc worker-2994" 0x00007fdf36f0ead3 in ?? ()
  87   LWP 29941 "rpc worker-2994" 0x00007fdf36f0ead3 in ?? ()
  88   LWP 29942 "rpc worker-2994" 0x00007fdf36f0ead3 in ?? ()
  89   LWP 29943 "rpc worker-2994" 0x00007fdf36f0ead3 in ?? ()
  90   LWP 29944 "rpc worker-2994" 0x00007fdf36f0ead3 in ?? ()
  91   LWP 29945 "rpc worker-2994" 0x00007fdf36f0ead3 in ?? ()
  92   LWP 29946 "rpc worker-2994" 0x00007fdf36f0ead3 in ?? ()
  93   LWP 29947 "rpc worker-2994" 0x00007fdf36f0ead3 in ?? ()
  94   LWP 29948 "rpc worker-2994" 0x00007fdf36f0ead3 in ?? ()
  95   LWP 29949 "rpc worker-2994" 0x00007fdf36f0ead3 in ?? ()
  96   LWP 29950 "rpc worker-2995" 0x00007fdf36f0ead3 in ?? ()
  97   LWP 29951 "rpc worker-2995" 0x00007fdf36f0ead3 in ?? ()
  98   LWP 29952 "rpc worker-2995" 0x00007fdf36f0ead3 in ?? ()
  99   LWP 29953 "rpc worker-2995" 0x00007fdf36f0ead3 in ?? ()
  100  LWP 29954 "rpc worker-2995" 0x00007fdf36f0ead3 in ?? ()
  101  LWP 29955 "rpc worker-2995" 0x00007fdf36f0ead3 in ?? ()
  102  LWP 29956 "rpc worker-2995" 0x00007fdf36f0ead3 in ?? ()
  103  LWP 29957 "rpc worker-2995" 0x00007fdf36f0ead3 in ?? ()
  104  LWP 29958 "rpc worker-2995" 0x00007fdf36f0ead3 in ?? ()
  105  LWP 29959 "rpc worker-2995" 0x00007fdf36f0ead3 in ?? ()
  106  LWP 29960 "rpc worker-2996" 0x00007fdf36f0ead3 in ?? ()
  107  LWP 29961 "rpc worker-2996" 0x00007fdf36f0ead3 in ?? ()
  108  LWP 29962 "rpc worker-2996" 0x00007fdf36f0ead3 in ?? ()
  109  LWP 29963 "rpc worker-2996" 0x00007fdf36f0ead3 in ?? ()
  110  LWP 29964 "rpc worker-2996" 0x00007fdf36f0ead3 in ?? ()
  111  LWP 29965 "rpc worker-2996" 0x00007fdf36f0ead3 in ?? ()
  112  LWP 29966 "rpc worker-2996" 0x00007fdf36f0ead3 in ?? ()
  113  LWP 29967 "rpc worker-2996" 0x00007fdf36f0ead3 in ?? ()
  114  LWP 29968 "rpc worker-2996" 0x00007fdf36f0ead3 in ?? ()
  115  LWP 29969 "rpc worker-2996" 0x00007fdf36f0ead3 in ?? ()
  116  LWP 29970 "rpc worker-2997" 0x00007fdf36f0ead3 in ?? ()
  117  LWP 29971 "diag-logger-299" 0x00007fdf36f0efb9 in ?? ()
  118  LWP 29972 "result-tracker-" 0x00007fdf36f0efb9 in ?? ()
  119  LWP 29973 "excess-log-dele" 0x00007fdf36f0efb9 in ?? ()
  120  LWP 29974 "tcmalloc-memory" 0x00007fdf36f0efb9 in ?? ()
  121  LWP 29975 "acceptor-29975" 0x00007fdf350320c7 in ?? ()
  122  LWP 29976 "heartbeat-29976" 0x00007fdf36f0efb9 in ?? ()
  123  LWP 29977 "maintenance_sch" 0x00007fdf36f0efb9 in ?? ()
  124  LWP 30455 "raft [worker]-3" 0x00007fdf36f0efb9 in ?? ()

Thread 124 (LWP 30455):
#0  0x00007fdf36f0efb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x00000000000004e3 in ?? ()
#3  0x0000000100000081 in ?? ()
#4  0x00007fdee89b4764 in ?? ()
#5  0x00007fdee89b4510 in ?? ()
#6  0x00000000000009c7 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00007fdee89b4530 in ?? ()
#9  0x0000000200000000 in ?? ()
#10 0x0000008000000189 in ?? ()
#11 0x00007fdee89b4590 in ?? ()
#12 0x00007fdf36b82711 in ?? ()
#13 0x0000000000000000 in ?? ()

Thread 123 (LWP 29977):
#0  0x00007fdf36f0efb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000023 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x0000561c56d45e50 in ?? ()
#5  0x00007fdeed9be470 in ?? ()
#6  0x0000000000000046 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 29976):
#0  0x00007fdf36f0efb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000000c in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x0000561c56c95930 in ?? ()
#5  0x00007fdeee1bf3f0 in ?? ()
#6  0x0000000000000018 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 121 (LWP 29975):
#0  0x00007fdf350320c7 in ?? ()
#1  0x00007fdeee9c0020 in ?? ()
#2  0x00007fdf36b92ec2 in ?? ()
#3  0x00007fdeee9c0020 in ?? ()
#4  0x0000000000080800 in ?? ()
#5  0x00007fdeee9c03e0 in ?? ()
#6  0x00007fdeee9c0090 in ?? ()
#7  0x0000561c56c404c8 in ?? ()
#8  0x00007fdf36b98959 in ?? ()
#9  0x00007fdeee9c0510 in ?? ()
#10 0x00007fdeee9c0700 in ?? ()
#11 0x0000008000000000 in ?? ()
#12 0x00007fdf36f123a7 in ?? ()
#13 0x00007fdeee9c1520 in ?? ()
#14 0x00007fdeee9c0260 in ?? ()
#15 0x0000561c56cf1140 in ?? ()
#16 0x0000000000000000 in ?? ()

Thread 120 (LWP 29974):
#0  0x00007fdf36f0efb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffe894cdd70 in ?? ()
#5  0x00007fdeef1c1670 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 29973):
#0  0x00007fdf36f0efb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 118 (LWP 29972):
#0  0x00007fdf36f0efb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000008 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x0000561c56bc9b70 in ?? ()
#5  0x00007fdef01c3680 in ?? ()
#6  0x0000000000000010 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 29971):
#0  0x00007fdf36f0efb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000008 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x0000561c56f51390 in ?? ()
#5  0x00007fdef09c4550 in ?? ()
#6  0x0000000000000010 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 116 (LWP 29970):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000007 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x0000561c56f5273c in ?? ()
#4  0x00007fdef11c55c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fdef11c55e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x0000561c56f52728 in ?? ()
#9  0x00007fdf36f0e770 in ?? ()
#10 0x00007fdef11c55e0 in ?? ()
#11 0x00007fdef11c5640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 115 (LWP 29969):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x0000561c56f526bc in ?? ()
#4  0x00007fdef19c65c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fdef19c65e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x0000561c56f526a8 in ?? ()
#9  0x00007fdf36f0e770 in ?? ()
#10 0x00007fdef19c65e0 in ?? ()
#11 0x00007fdef19c6640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 114 (LWP 29968):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 29967):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 29966):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 29965):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 29964):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 29963):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 29962):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 29961):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 29960):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 29959):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 29958):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 29957):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 29956):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 29955):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 29954):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 29953):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 29952):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 97 (LWP 29951):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 29950):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 29949):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 29948):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 29947):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 29946):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 29945):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 29944):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 29943):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 29942):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 29941):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 29940):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 29939):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 29938):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 29937):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 29936):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 29935):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 29934):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 29933):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 29932):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 29931):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 76 (LWP 29930):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000004 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x0000561c56f1b138 in ?? ()
#4  0x00007fdf051ed5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fdf051ed5e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 75 (LWP 29929):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 74 (LWP 29928):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 29927):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 29926):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 29925):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 29924):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 29923):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 29922):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 29921):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 29920):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 29919):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 29918):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 29917):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 29916):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 29915):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 29914):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 29913):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 29912):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 29911):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 56 (LWP 29910):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x0000561c56f1a638 in ?? ()
#4  0x00007fdf0f2015c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fdf0f2015e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 55 (LWP 29909):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 29908):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 29907):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 29906):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 29905):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 29904):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 29903):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 29902):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 29901):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 29900):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 29899):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 29898):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 29897):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 29896):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 29895):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 29894):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 29893):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 29892):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 29891):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 36 (LWP 29890):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000330 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x0000561c56e35b38 in ?? ()
#4  0x00007fdf192155c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fdf192155e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 35 (LWP 29889):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000585 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x0000561c56e35abc in ?? ()
#4  0x00007fdf19a165c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fdf19a165e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x0000561c56e35aa8 in ?? ()
#9  0x00007fdf36f0e770 in ?? ()
#10 0x00007fdf19a165e0 in ?? ()
#11 0x00007fdf19a16640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 34 (LWP 29888):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x00000000000019f3 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x0000561c56e35a3c in ?? ()
#4  0x00007fdf1a2175c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fdf1a2175e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x0000561c56e35a28 in ?? ()
#9  0x00007fdf36f0e770 in ?? ()
#10 0x00007fdf1a2175e0 in ?? ()
#11 0x00007fdf1a217640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 33 (LWP 29887):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x000000000000051d in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x0000561c56e359bc in ?? ()
#4  0x00007fdf1aa185c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fdf1aa185e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x0000561c56e359a8 in ?? ()
#9  0x00007fdf36f0e770 in ?? ()
#10 0x00007fdf1aa185e0 in ?? ()
#11 0x00007fdf1aa18640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 32 (LWP 29886):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x000000000000182f in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x0000561c56e3593c in ?? ()
#4  0x00007fdf1b2195c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fdf1b2195e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x0000561c56e35928 in ?? ()
#9  0x00007fdf36f0e770 in ?? ()
#10 0x00007fdf1b2195e0 in ?? ()
#11 0x00007fdf1b219640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 31 (LWP 29885):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x00000000000018c7 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x0000561c56e358bc in ?? ()
#4  0x00007fdf1ba1a5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fdf1ba1a5e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x0000561c56e358a8 in ?? ()
#9  0x00007fdf36f0e770 in ?? ()
#10 0x00007fdf1ba1a5e0 in ?? ()
#11 0x00007fdf1ba1a640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 30 (LWP 29884):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 29883):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 29882):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 29881):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 29880):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 29879):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 29878):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 29877):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 29876):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 29875):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 29874):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 19 (LWP 29873):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 18 (LWP 29872):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 17 (LWP 29871):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 16 (LWP 29870):
#0  0x00007fdf36f0efb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 15 (LWP 29869):
#0  0x00007fdf36f0efb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x0000561c56baf6c8 in ?? ()
#5  0x00007fdf23a2a6a0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 29867):
#0  0x00007fdf36f0efb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 13 (LWP 29866):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 29865):
#0  0x00007fdf35030a47 in ?? ()
#1  0x00007fdf25a2e680 in ?? ()
#2  0x00007fdf30334571 in ?? ()
#3  0x00000000000002de in ?? ()
#4  0x0000561c56ca7398 in ?? ()
#5  0x00007fdf25a2e6c0 in ?? ()
#6  0x00007fdf25a2e840 in ?? ()
#7  0x0000561c56d4c670 in ?? ()
#8  0x00007fdf3033625d in ?? ()
#9  0x3fb961304f278000 in ?? ()
#10 0x0000561c56c98c00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x0000561c56c98c00 in ?? ()
#13 0x0000000056ca7398 in ?? ()
#14 0x0000561c00000000 in ?? ()
#15 0x41da4f1fed70f99b in ?? ()
#16 0x0000561c56d4c670 in ?? ()
#17 0x00007fdf25a2e720 in ?? ()
#18 0x00007fdf3033aba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb961304f278000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 11 (LWP 29864):
#0  0x00007fdf35030a47 in ?? ()
#1  0x00007fdf2622f680 in ?? ()
#2  0x00007fdf30334571 in ?? ()
#3  0x00000000000002de in ?? ()
#4  0x0000561c56ca7018 in ?? ()
#5  0x00007fdf2622f6c0 in ?? ()
#6  0x00007fdf2622f840 in ?? ()
#7  0x0000561c56d4c670 in ?? ()
#8  0x00007fdf3033625d in ?? ()
#9  0x3faeadec9e098000 in ?? ()
#10 0x0000561c56c98680 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x0000561c56c98680 in ?? ()
#13 0x0000000056ca7018 in ?? ()
#14 0x0000561c00000000 in ?? ()
#15 0x41da4f1fed70f9a0 in ?? ()
#16 0x0000561c56d4c670 in ?? ()
#17 0x00007fdf2622f720 in ?? ()
#18 0x00007fdf3033aba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3faeadec9e098000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 10 (LWP 29863):
#0  0x00007fdf35030a47 in ?? ()
#1  0x00007fdf26a30680 in ?? ()
#2  0x00007fdf30334571 in ?? ()
#3  0x00000000000002de in ?? ()
#4  0x0000561c56ca7558 in ?? ()
#5  0x00007fdf26a306c0 in ?? ()
#6  0x00007fdf26a30840 in ?? ()
#7  0x0000561c56d4c670 in ?? ()
#8  0x00007fdf3033625d in ?? ()
#9  0x3f8a14f210ba0000 in ?? ()
#10 0x0000561c56c98100 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x0000561c56c98100 in ?? ()
#13 0x0000000056ca7558 in ?? ()
#14 0x0000561c00000000 in ?? ()
#15 0x41da4f1fed70f9a0 in ?? ()
#16 0x0000561c56d4c670 in ?? ()
#17 0x00007fdf26a30720 in ?? ()
#18 0x00007fdf3033aba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3f8a14f210ba0000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 9 (LWP 29862):
#0  0x00007fdf35030a47 in ?? ()
#1  0x00007fdf28612680 in ?? ()
#2  0x00007fdf30334571 in ?? ()
#3  0x00000000000002de in ?? ()
#4  0x0000561c56ca6e58 in ?? ()
#5  0x00007fdf286126c0 in ?? ()
#6  0x00007fdf28612840 in ?? ()
#7  0x0000561c56d4c670 in ?? ()
#8  0x00007fdf3033625d in ?? ()
#9  0x3fb95a821712c000 in ?? ()
#10 0x0000561c56c97600 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x0000561c56c97600 in ?? ()
#13 0x0000000056ca6e58 in ?? ()
#14 0x0000561c00000000 in ?? ()
#15 0x41da4f1fed70f99b in ?? ()
#16 0x0000561c56d4c670 in ?? ()
#17 0x00007fdf28612720 in ?? ()
#18 0x00007fdf3033aba3 in ?? ()
#19 0x0000000000000000 in ?? ()

Thread 8 (LWP 29859):
#0  0x00007fdf35023cb9 in ?? ()
#1  0x00007fdf29e15840 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 7 (LWP 29858):
#0  0x00007fdf36f0efb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 6 (LWP 29857):
#0  0x00007fdf36f129e2 in ?? ()
#1  0x0000561c56bc9ee0 in ?? ()
#2  0x00007fdf28e134d0 in ?? ()
#3  0x00007fdf28e13450 in ?? ()
#4  0x00007fdf28e13570 in ?? ()
#5  0x00007fdf28e13790 in ?? ()
#6  0x00007fdf28e137a0 in ?? ()
#7  0x00007fdf28e134e0 in ?? ()
#8  0x00007fdf28e134d0 in ?? ()
#9  0x0000561c56bc8350 in ?? ()
#10 0x00007fdf372fdc6f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 5 (LWP 29851):
#0  0x00007fdf36f0efb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000002d in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x0000561c56d4edc8 in ?? ()
#5  0x00007fdf2ae17430 in ?? ()
#6  0x000000000000005a in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 4 (LWP 29850):
#0  0x00007fdf36f0efb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x0000561c56bae848 in ?? ()
#5  0x00007fdf2b618790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 3 (LWP 29849):
#0  0x00007fdf36f0efb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x0000561c56bae2a8 in ?? ()
#5  0x00007fdf2be19790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 2 (LWP 29848):
#0  0x00007fdf36f0efb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x0000561c56bae188 in ?? ()
#5  0x00007fdf2c61a790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 1 (LWP 29846):
#0  0x00007fdf36f12d50 in ?? ()
#1  0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20251212 21:11:35.254654 23994 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 2 with UUID c941504e89314a6a868d59585d254b81 and pid 30247
************************ BEGIN STACKS **************************
[New LWP 30248]
[New LWP 30249]
[New LWP 30250]
[New LWP 30251]
[New LWP 30257]
[New LWP 30258]
[New LWP 30259]
[New LWP 30262]
[New LWP 30263]
[New LWP 30264]
[New LWP 30265]
[New LWP 30266]
[New LWP 30267]
[New LWP 30269]
[New LWP 30270]
[New LWP 30271]
[New LWP 30272]
[New LWP 30273]
[New LWP 30274]
[New LWP 30275]
[New LWP 30276]
[New LWP 30277]
[New LWP 30278]
[New LWP 30279]
[New LWP 30280]
[New LWP 30281]
[New LWP 30282]
[New LWP 30283]
[New LWP 30284]
[New LWP 30285]
[New LWP 30286]
[New LWP 30287]
[New LWP 30288]
[New LWP 30289]
[New LWP 30290]
[New LWP 30291]
[New LWP 30292]
[New LWP 30293]
[New LWP 30294]
[New LWP 30295]
[New LWP 30296]
[New LWP 30297]
[New LWP 30298]
[New LWP 30299]
[New LWP 30300]
[New LWP 30301]
[New LWP 30302]
[New LWP 30303]
[New LWP 30304]
[New LWP 30305]
[New LWP 30306]
[New LWP 30307]
[New LWP 30308]
[New LWP 30309]
[New LWP 30310]
[New LWP 30311]
[New LWP 30312]
[New LWP 30313]
[New LWP 30314]
[New LWP 30315]
[New LWP 30316]
[New LWP 30317]
[New LWP 30318]
[New LWP 30319]
[New LWP 30320]
[New LWP 30321]
[New LWP 30322]
[New LWP 30323]
[New LWP 30324]
[New LWP 30325]
[New LWP 30326]
[New LWP 30327]
[New LWP 30328]
[New LWP 30329]
[New LWP 30330]
[New LWP 30331]
[New LWP 30332]
[New LWP 30333]
[New LWP 30334]
[New LWP 30335]
[New LWP 30336]
[New LWP 30337]
[New LWP 30338]
[New LWP 30339]
[New LWP 30340]
[New LWP 30341]
[New LWP 30342]
[New LWP 30343]
[New LWP 30344]
[New LWP 30345]
[New LWP 30346]
[New LWP 30347]
[New LWP 30348]
[New LWP 30349]
[New LWP 30350]
[New LWP 30351]
[New LWP 30352]
[New LWP 30353]
[New LWP 30354]
[New LWP 30355]
[New LWP 30356]
[New LWP 30357]
[New LWP 30358]
[New LWP 30359]
[New LWP 30360]
[New LWP 30361]
[New LWP 30362]
[New LWP 30363]
[New LWP 30364]
[New LWP 30365]
[New LWP 30366]
[New LWP 30367]
[New LWP 30368]
[New LWP 30369]
[New LWP 30370]
[New LWP 30371]
[New LWP 30372]
[New LWP 30373]
[New LWP 30374]
[New LWP 30375]
[New LWP 30376]
[New LWP 30377]
0x00007fe29a81bd50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 30247 "kudu"  0x00007fe29a81bd50 in ?? ()
  2    LWP 30248 "kudu"  0x00007fe29a817fb9 in ?? ()
  3    LWP 30249 "kudu"  0x00007fe29a817fb9 in ?? ()
  4    LWP 30250 "kudu"  0x00007fe29a817fb9 in ?? ()
  5    LWP 30251 "kernel-watcher-" 0x00007fe29a817fb9 in ?? ()
  6    LWP 30257 "ntp client-3025" 0x00007fe29a81b9e2 in ?? ()
  7    LWP 30258 "file cache-evic" 0x00007fe29a817fb9 in ?? ()
  8    LWP 30259 "sq_acceptor" 0x00007fe29892ccb9 in ?? ()
  9    LWP 30262 "rpc reactor-302" 0x00007fe298939a47 in ?? ()
  10   LWP 30263 "rpc reactor-302" 0x00007fe298939a47 in ?? ()
  11   LWP 30264 "rpc reactor-302" 0x00007fe298939a47 in ?? ()
  12   LWP 30265 "rpc reactor-302" 0x00007fe298939a47 in ?? ()
  13   LWP 30266 "MaintenanceMgr " 0x00007fe29a817ad3 in ?? ()
  14   LWP 30267 "txn-status-mana" 0x00007fe29a817fb9 in ?? ()
  15   LWP 30269 "collect_and_rem" 0x00007fe29a817fb9 in ?? ()
  16   LWP 30270 "tc-session-exp-" 0x00007fe29a817fb9 in ?? ()
  17   LWP 30271 "rpc worker-3027" 0x00007fe29a817ad3 in ?? ()
  18   LWP 30272 "rpc worker-3027" 0x00007fe29a817ad3 in ?? ()
  19   LWP 30273 "rpc worker-3027" 0x00007fe29a817ad3 in ?? ()
  20   LWP 30274 "rpc worker-3027" 0x00007fe29a817ad3 in ?? ()
  21   LWP 30275 "rpc worker-3027" 0x00007fe29a817ad3 in ?? ()
  22   LWP 30276 "rpc worker-3027" 0x00007fe29a817ad3 in ?? ()
  23   LWP 30277 "rpc worker-3027" 0x00007fe29a817ad3 in ?? ()
  24   LWP 30278 "rpc worker-3027" 0x00007fe29a817ad3 in ?? ()
  25   LWP 30279 "rpc worker-3027" 0x00007fe29a817ad3 in ?? ()
  26   LWP 30280 "rpc worker-3028" 0x00007fe29a817ad3 in ?? ()
  27   LWP 30281 "rpc worker-3028" 0x00007fe29a817ad3 in ?? ()
  28   LWP 30282 "rpc worker-3028" 0x00007fe29a817ad3 in ?? ()
  29   LWP 30283 "rpc worker-3028" 0x00007fe29a817ad3 in ?? ()
  30   LWP 30284 "rpc worker-3028" 0x00007fe29a817ad3 in ?? ()
  31   LWP 30285 "rpc worker-3028" 0x00007fe29a817ad3 in ?? ()
  32   LWP 30286 "rpc worker-3028" 0x00007fe29a817ad3 in ?? ()
  33   LWP 30287 "rpc worker-3028" 0x00007fe29a817ad3 in ?? ()
  34   LWP 30288 "rpc worker-3028" 0x00007fe29a817ad3 in ?? ()
  35   LWP 30289 "rpc worker-3028" 0x00007fe29a817ad3 in ?? ()
  36   LWP 30290 "rpc worker-3029" 0x00007fe29a817ad3 in ?? ()
  37   LWP 30291 "rpc worker-3029" 0x00007fe29a817ad3 in ?? ()
  38   LWP 30292 "rpc worker-3029" 0x00007fe29a817ad3 in ?? ()
  39   LWP 30293 "rpc worker-3029" 0x00007fe29a817ad3 in ?? ()
  40   LWP 30294 "rpc worker-3029" 0x00007fe29a817ad3 in ?? ()
  41   LWP 30295 "rpc worker-3029" 0x00007fe29a817ad3 in ?? ()
  42   LWP 30296 "rpc worker-3029" 0x00007fe29a817ad3 in ?? ()
  43   LWP 30297 "rpc worker-3029" 0x00007fe29a817ad3 in ?? ()
  44   LWP 30298 "rpc worker-3029" 0x00007fe29a817ad3 in ?? ()
  45   LWP 30299 "rpc worker-3029" 0x00007fe29a817ad3 in ?? ()
  46   LWP 30300 "rpc worker-3030" 0x00007fe29a817ad3 in ?? ()
  47   LWP 30301 "rpc worker-3030" 0x00007fe29a817ad3 in ?? ()
  48   LWP 30302 "rpc worker-3030" 0x00007fe29a817ad3 in ?? ()
  49   LWP 30303 "rpc worker-3030" 0x00007fe29a817ad3 in ?? ()
  50   LWP 30304 "rpc worker-3030" 0x00007fe29a817ad3 in ?? ()
  51   LWP 30305 "rpc worker-3030" 0x00007fe29a817ad3 in ?? ()
  52   LWP 30306 "rpc worker-3030" 0x00007fe29a817ad3 in ?? ()
  53   LWP 30307 "rpc worker-3030" 0x00007fe29a817ad3 in ?? ()
  54   LWP 30308 "rpc worker-3030" 0x00007fe29a817ad3 in ?? ()
  55   LWP 30309 "rpc worker-3030" 0x00007fe29a817ad3 in ?? ()
  56   LWP 30310 "rpc worker-3031" 0x00007fe29a817ad3 in ?? ()
  57   LWP 30311 "rpc worker-3031" 0x00007fe29a817ad3 in ?? ()
  58   LWP 30312 "rpc worker-3031" 0x00007fe29a817ad3 in ?? ()
  59   LWP 30313 "rpc worker-3031" 0x00007fe29a817ad3 in ?? ()
  60   LWP 30314 "rpc worker-3031" 0x00007fe29a817ad3 in ?? ()
  61   LWP 30315 "rpc worker-3031" 0x00007fe29a817ad3 in ?? ()
  62   LWP 30316 "rpc worker-3031" 0x00007fe29a817ad3 in ?? ()
  63   LWP 30317 "rpc worker-3031" 0x00007fe29a817ad3 in ?? ()
  64   LWP 30318 "rpc worker-3031" 0x00007fe29a817ad3 in ?? ()
  65   LWP 30319 "rpc worker-3031" 0x00007fe29a817ad3 in ?? ()
  66   LWP 30320 "rpc worker-3032" 0x00007fe29a817ad3 in ?? ()
  67   LWP 30321 "rpc worker-3032" 0x00007fe29a817ad3 in ?? ()
  68   LWP 30322 "rpc worker-3032" 0x00007fe29a817ad3 in ?? ()
  69   LWP 30323 "rpc worker-3032" 0x00007fe29a817ad3 in ?? ()
  70   LWP 30324 "rpc worker-3032" 0x00007fe29a817ad3 in ?? ()
  71   LWP 30325 "rpc worker-3032" 0x00007fe29a817ad3 in ?? ()
  72   LWP 30326 "rpc worker-3032" 0x00007fe29a817ad3 in ?? ()
  73   LWP 30327 "rpc worker-3032" 0x00007fe29a817ad3 in ?? ()
  74   LWP 30328 "rpc worker-3032" 0x00007fe29a817ad3 in ?? ()
  75   LWP 30329 "rpc worker-3032" 0x00007fe29a817ad3 in ?? ()
  76   LWP 30330 "rpc worker-3033" 0x00007fe29a817ad3 in ?? ()
  77   LWP 30331 "rpc worker-3033" 0x00007fe29a817ad3 in ?? ()
  78   LWP 30332 "rpc worker-3033" 0x00007fe29a817ad3 in ?? ()
  79   LWP 30333 "rpc worker-3033" 0x00007fe29a817ad3 in ?? ()
  80   LWP 30334 "rpc worker-3033" 0x00007fe29a817ad3 in ?? ()
  81   LWP 30335 "rpc worker-3033" 0x00007fe29a817ad3 in ?? ()
  82   LWP 30336 "rpc worker-3033" 0x00007fe29a817ad3 in ?? ()
  83   LWP 30337 "rpc worker-3033" 0x00007fe29a817ad3 in ?? ()
  84   LWP 30338 "rpc worker-3033" 0x00007fe29a817ad3 in ?? ()
  85   LWP 30339 "rpc worker-3033" 0x00007fe29a817ad3 in ?? ()
  86   LWP 30340 "rpc worker-3034" 0x00007fe29a817ad3 in ?? ()
  87   LWP 30341 "rpc worker-3034" 0x00007fe29a817ad3 in ?? ()
  88   LWP 30342 "rpc worker-3034" 0x00007fe29a817ad3 in ?? ()
  89   LWP 30343 "rpc worker-3034" 0x00007fe29a817ad3 in ?? ()
  90   LWP 30344 "rpc worker-3034" 0x00007fe29a817ad3 in ?? ()
  91   LWP 30345 "rpc worker-3034" 0x00007fe29a817ad3 in ?? ()
  92   LWP 30346 "rpc worker-3034" 0x00007fe29a817ad3 in ?? ()
  93   LWP 30347 "rpc worker-3034" 0x00007fe29a817ad3 in ?? ()
  94   LWP 30348 "rpc worker-3034" 0x00007fe29a817ad3 in ?? ()
  95   LWP 30349 "rpc worker-3034" 0x00007fe29a817ad3 in ?? ()
  96   LWP 30350 "rpc worker-3035" 0x00007fe29a817ad3 in ?? ()
  97   LWP 30351 "rpc worker-3035" 0x00007fe29a817ad3 in ?? ()
  98   LWP 30352 "rpc worker-3035" 0x00007fe29a817ad3 in ?? ()
  99   LWP 30353 "rpc worker-3035" 0x00007fe29a817ad3 in ?? ()
  100  LWP 30354 "rpc worker-3035" 0x00007fe29a817ad3 in ?? ()
  101  LWP 30355 "rpc worker-3035" 0x00007fe29a817ad3 in ?? ()
  102  LWP 30356 "rpc worker-3035" 0x00007fe29a817ad3 in ?? ()
  103  LWP 30357 "rpc worker-3035" 0x00007fe29a817ad3 in ?? ()
  104  LWP 30358 "rpc worker-3035" 0x00007fe29a817ad3 in ?? ()
  105  LWP 30359 "rpc worker-3035" 0x00007fe29a817ad3 in ?? ()
  106  LWP 30360 "rpc worker-3036" 0x00007fe29a817ad3 in ?? ()
  107  LWP 30361 "rpc worker-3036" 0x00007fe29a817ad3 in ?? ()
  108  LWP 30362 "rpc worker-3036" 0x00007fe29a817ad3 in ?? ()
  109  LWP 30363 "rpc worker-3036" 0x00007fe29a817ad3 in ?? ()
  110  LWP 30364 "rpc worker-3036" 0x00007fe29a817ad3 in ?? ()
  111  LWP 30365 "rpc worker-3036" 0x00007fe29a817ad3 in ?? ()
  112  LWP 30366 "rpc worker-3036" 0x00007fe29a817ad3 in ?? ()
  113  LWP 30367 "rpc worker-3036" 0x00007fe29a817ad3 in ?? ()
  114  LWP 30368 "rpc worker-3036" 0x00007fe29a817ad3 in ?? ()
  115  LWP 30369 "rpc worker-3036" 0x00007fe29a817ad3 in ?? ()
  116  LWP 30370 "rpc worker-3037" 0x00007fe29a817ad3 in ?? ()
  117  LWP 30371 "diag-logger-303" 0x00007fe29a817fb9 in ?? ()
  118  LWP 30372 "result-tracker-" 0x00007fe29a817fb9 in ?? ()
  119  LWP 30373 "excess-log-dele" 0x00007fe29a817fb9 in ?? ()
  120  LWP 30374 "tcmalloc-memory" 0x00007fe29a817fb9 in ?? ()
  121  LWP 30375 "acceptor-30375" 0x00007fe29893b0c7 in ?? ()
  122  LWP 30376 "heartbeat-30376" 0x00007fe29a817fb9 in ?? ()
  123  LWP 30377 "maintenance_sch" 0x00007fe29a817fb9 in ?? ()

Thread 123 (LWP 30377):
#0  0x00007fe29a817fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000023 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055b438e13e50 in ?? ()
#5  0x00007fe2512c7470 in ?? ()
#6  0x0000000000000046 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 30376):
#0  0x00007fe29a817fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000000a in ?? ()
#3  0x0000000100000081 in ?? ()
#4  0x000055b438d63934 in ?? ()
#5  0x00007fe251ac83f0 in ?? ()
#6  0x0000000000000015 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00007fe251ac8410 in ?? ()
#9  0x0000000200000000 in ?? ()
#10 0x0000008000000189 in ?? ()
#11 0x00007fe251ac8470 in ?? ()
#12 0x00007fe29a48b711 in ?? ()
#13 0x0000000000000000 in ?? ()

Thread 121 (LWP 30375):
#0  0x00007fe29893b0c7 in ?? ()
#1  0x00007fe2522c9020 in ?? ()
#2  0x00007fe29a49bec2 in ?? ()
#3  0x00007fe2522c9020 in ?? ()
#4  0x0000000000080800 in ?? ()
#5  0x00007fe2522c93e0 in ?? ()
#6  0x00007fe2522c9090 in ?? ()
#7  0x000055b438d0e4c8 in ?? ()
#8  0x00007fe29a4a1959 in ?? ()
#9  0x00007fe2522c9510 in ?? ()
#10 0x00007fe2522c9700 in ?? ()
#11 0x0000008000000000 in ?? ()
#12 0x00007fe29a81b3a7 in ?? ()
#13 0x00007fe2522ca520 in ?? ()
#14 0x00007fe2522c9260 in ?? ()
#15 0x000055b438dbf140 in ?? ()
#16 0x0000000000000000 in ?? ()

Thread 120 (LWP 30374):
#0  0x00007fe29a817fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffd3168eac0 in ?? ()
#5  0x00007fe252aca670 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 30373):
#0  0x00007fe29a817fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 118 (LWP 30372):
#0  0x00007fe29a817fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055b438c97b70 in ?? ()
#5  0x00007fe253acc680 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 30371):
#0  0x00007fe29a817fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055b439010790 in ?? ()
#5  0x00007fe2542cd550 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 116 (LWP 30370):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000007 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055b4390196bc in ?? ()
#4  0x00007fe254ace5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fe254ace5e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055b4390196a8 in ?? ()
#9  0x00007fe29a817770 in ?? ()
#10 0x00007fe254ace5e0 in ?? ()
#11 0x00007fe254ace640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 115 (LWP 30369):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055b43901963c in ?? ()
#4  0x00007fe2552cf5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fe2552cf5e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055b439019628 in ?? ()
#9  0x00007fe29a817770 in ?? ()
#10 0x00007fe2552cf5e0 in ?? ()
#11 0x00007fe2552cf640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 114 (LWP 30368):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 30367):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 30366):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 30365):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 30364):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 30363):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 30362):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 30361):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 30360):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 30359):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 30358):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 30357):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 30356):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 30355):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 30354):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 30353):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 30352):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 97 (LWP 30351):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 30350):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 30349):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 30348):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 30347):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 30346):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 30345):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 30344):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 30343):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 30342):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 30341):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 30340):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 30339):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 30338):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 30337):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 30336):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 30335):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 30334):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 30333):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 30332):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 30331):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 76 (LWP 30330):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000285 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055b4390180bc in ?? ()
#4  0x00007fe268af65c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fe268af65e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055b4390180a8 in ?? ()
#9  0x00007fe29a817770 in ?? ()
#10 0x00007fe268af65e0 in ?? ()
#11 0x00007fe268af6640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 75 (LWP 30329):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x00000000000002d2 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055b439018038 in ?? ()
#4  0x00007fe2692f75c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fe2692f75e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 74 (LWP 30328):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 30327):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 30326):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 30325):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 30324):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 30323):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 30322):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 30321):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 30320):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 30319):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 30318):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 30317):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 30316):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 30315):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 30314):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 30313):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 30312):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 30311):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 56 (LWP 30310):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055b4390155b8 in ?? ()
#4  0x00007fe272b0a5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fe272b0a5e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 55 (LWP 30309):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 30308):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 30307):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 30306):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 30305):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 30304):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 30303):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 30302):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 30301):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 30300):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 30299):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 30298):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 30297):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 30296):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 30295):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 30294):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 30293):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 30292):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 30291):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 36 (LWP 30290):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000004 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055b439014c38 in ?? ()
#4  0x00007fe27cb1e5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fe27cb1e5e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 35 (LWP 30289):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000045 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055b439014bbc in ?? ()
#4  0x00007fe27d31f5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fe27d31f5e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055b439014ba8 in ?? ()
#9  0x00007fe29a817770 in ?? ()
#10 0x00007fe27d31f5e0 in ?? ()
#11 0x00007fe27d31f640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 34 (LWP 30288):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 33 (LWP 30287):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 32 (LWP 30286):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 31 (LWP 30285):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 30 (LWP 30284):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 30283):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 30282):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 30281):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 30280):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 30279):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 30278):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 30277):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 30276):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 30275):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 30274):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 19 (LWP 30273):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 18 (LWP 30272):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 17 (LWP 30271):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 16 (LWP 30270):
#0  0x00007fe29a817fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 15 (LWP 30269):
#0  0x00007fe29a817fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055b438c7d6c8 in ?? ()
#5  0x00007fe2873336a0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 30267):
#0  0x00007fe29a817fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 13 (LWP 30266):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 30265):
#0  0x00007fe298939a47 in ?? ()
#1  0x00007fe289337680 in ?? ()
#2  0x00007fe293c3d571 in ?? ()
#3  0x00000000000002df in ?? ()
#4  0x000055b438d75398 in ?? ()
#5  0x00007fe2893376c0 in ?? ()
#6  0x00007fe289337840 in ?? ()
#7  0x000055b438e1a670 in ?? ()
#8  0x00007fe293c3f25d in ?? ()
#9  0x3fb95f707de34000 in ?? ()
#10 0x000055b438d66c00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055b438d66c00 in ?? ()
#13 0x0000000038d75398 in ?? ()
#14 0x000055b400000000 in ?? ()
#15 0x41da4f1fed70f99f in ?? ()
#16 0x000055b438e1a670 in ?? ()
#17 0x00007fe289337720 in ?? ()
#18 0x00007fe293c43ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb95f707de34000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 11 (LWP 30264):
#0  0x00007fe298939a47 in ?? ()
#1  0x00007fe289b38680 in ?? ()
#2  0x00007fe293c3d571 in ?? ()
#3  0x00000000000002df in ?? ()
#4  0x000055b438d75018 in ?? ()
#5  0x00007fe289b386c0 in ?? ()
#6  0x00007fe289b38840 in ?? ()
#7  0x000055b438e1a670 in ?? ()
#8  0x00007fe293c3f25d in ?? ()
#9  0x3fb98e5282698000 in ?? ()
#10 0x000055b438d66680 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055b438d66680 in ?? ()
#13 0x0000000038d75018 in ?? ()
#14 0x000055b400000000 in ?? ()
#15 0x41da4f1fed70f99c in ?? ()
#16 0x000055b438e1a670 in ?? ()
#17 0x00007fe289b38720 in ?? ()
#18 0x00007fe293c43ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb98e5282698000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 10 (LWP 30263):
#0  0x00007fe298939a47 in ?? ()
#1  0x00007fe28a339680 in ?? ()
#2  0x00007fe293c3d571 in ?? ()
#3  0x00000000000002df in ?? ()
#4  0x000055b438d75558 in ?? ()
#5  0x00007fe28a3396c0 in ?? ()
#6  0x00007fe28a339840 in ?? ()
#7  0x000055b438e1a670 in ?? ()
#8  0x00007fe293c3f25d in ?? ()
#9  0x3fa7f016a04c8000 in ?? ()
#10 0x000055b438d65600 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055b438d65600 in ?? ()
#13 0x0000000038d75558 in ?? ()
#14 0x000055b400000000 in ?? ()
#15 0x41da4f1fed70f99b in ?? ()
#16 0x000055b438e1a670 in ?? ()
#17 0x00007fe28a339720 in ?? ()
#18 0x00007fe293c43ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fa7f016a04c8000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 9 (LWP 30262):
#0  0x00007fe298939a47 in ?? ()
#1  0x00007fe28bf1b680 in ?? ()
#2  0x00007fe293c3d571 in ?? ()
#3  0x00000000000002df in ?? ()
#4  0x000055b438d74e58 in ?? ()
#5  0x00007fe28bf1b6c0 in ?? ()
#6  0x00007fe28bf1b840 in ?? ()
#7  0x000055b438e1a670 in ?? ()
#8  0x00007fe293c3f25d in ?? ()
#9  0x3fb95279f55b8000 in ?? ()
#10 0x000055b438d65b80 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055b438d65b80 in ?? ()
#13 0x0000000038d74e58 in ?? ()
#14 0x000055b400000000 in ?? ()
#15 0x41da4f1fed70f99f in ?? ()
#16 0x000055b438e1a670 in ?? ()
#17 0x00007fe28bf1b720 in ?? ()
#18 0x00007fe293c43ba3 in ?? ()
#19 0x0000000000000000 in ?? ()

Thread 8 (LWP 30259):
#0  0x00007fe29892ccb9 in ?? ()
#1  0x00007fe28d71e840 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 7 (LWP 30258):
#0  0x00007fe29a817fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 6 (LWP 30257):
#0  0x00007fe29a81b9e2 in ?? ()
#1  0x000055b438c97ee0 in ?? ()
#2  0x00007fe28c71c4d0 in ?? ()
#3  0x00007fe28c71c450 in ?? ()
#4  0x00007fe28c71c570 in ?? ()
#5  0x00007fe28c71c790 in ?? ()
#6  0x00007fe28c71c7a0 in ?? ()
#7  0x00007fe28c71c4e0 in ?? ()
#8  0x00007fe28c71c4d0 in ?? ()
#9  0x000055b438c96350 in ?? ()
#10 0x00007fe29ac06c6f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 5 (LWP 30251):
#0  0x00007fe29a817fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000002d in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055b438e1cdc8 in ?? ()
#5  0x00007fe28e720430 in ?? ()
#6  0x000000000000005a in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 4 (LWP 30250):
#0  0x00007fe29a817fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055b438c7c848 in ?? ()
#5  0x00007fe28ef21790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 3 (LWP 30249):
#0  0x00007fe29a817fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055b438c7c2a8 in ?? ()
#5  0x00007fe28f722790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 2 (LWP 30248):
#0  0x00007fe29a817fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055b438c7c188 in ?? ()
#5  0x00007fe28ff23790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 1 (LWP 30247):
#0  0x00007fe29a81bd50 in ?? ()
#1  0x0000000000000000 in ?? ()
************************* END STACKS ***************************
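Every frame in the dump above resolves to `?? ()` because gdb attached without symbols; the raw addresses could be symbolized offline against the `kudu-tserver` binary (e.g. with `addr2line`), but even unsymbolized, the thread list itself is informative: the process is dominated by idle `rpc worker` threads parked on the same futex address. A minimal sketch of summarizing such a dump by thread-name prefix (a hypothetical helper, not part of Kudu's test tooling; the trailing `-NNNN` tid suffix is stripped so workers group together):

```python
import re
from collections import Counter

def summarize_threads(dump: str) -> Counter:
    """Tally thread-name prefixes in gdb thread-list output.

    Matches lines of the form:
      17   LWP 30271 "rpc worker-3027" 0x00007fe29a817ad3 in ?? ()
    and strips the trailing numeric (tid) suffix from each name.
    """
    counts = Counter()
    for m in re.finditer(r'LWP \d+ "([^"]+)"', dump):
        # Drop a trailing "-1234" (or bare digits) so per-tid names collapse.
        name = re.sub(r'-?\d*$', '', m.group(1)).strip()
        counts[name] += 1
    return counts

sample = '''
  16   LWP 30270 "tc-session-exp-" 0x00007fe29a817fb9 in ?? ()
  17   LWP 30271 "rpc worker-3027" 0x00007fe29a817ad3 in ?? ()
  18   LWP 30272 "rpc worker-3027" 0x00007fe29a817ad3 in ?? ()
  121  LWP 30375 "acceptor-30375" 0x00007fe29893b0c7 in ?? ()
'''
print(summarize_threads(sample))
```

On the full dump above, this would show on the order of a hundred `rpc worker` threads versus a handful of reactors, heartbeaters, and maintenance threads, which is the expected idle profile for a tablet server rather than evidence of a hang in any particular subsystem.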
I20251212 21:11:35.770576 23994 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 3 with UUID 3d78cc34680848ddacf9620033efe712 and pid 30114
************************ BEGIN STACKS **************************
[New LWP 30117]
[New LWP 30118]
[New LWP 30119]
[New LWP 30120]
[New LWP 30126]
[New LWP 30127]
[New LWP 30128]
[New LWP 30131]
[New LWP 30132]
[New LWP 30133]
[New LWP 30134]
[New LWP 30135]
[New LWP 30136]
[New LWP 30137]
[New LWP 30138]
[New LWP 30139]
[New LWP 30140]
[New LWP 30141]
[New LWP 30142]
[New LWP 30143]
[New LWP 30144]
[New LWP 30145]
[New LWP 30146]
[New LWP 30147]
[New LWP 30148]
[New LWP 30149]
[New LWP 30150]
[New LWP 30151]
[New LWP 30152]
[New LWP 30153]
[New LWP 30154]
[New LWP 30155]
[New LWP 30156]
[New LWP 30157]
[New LWP 30158]
[New LWP 30159]
[New LWP 30160]
[New LWP 30161]
[New LWP 30162]
[New LWP 30163]
[New LWP 30164]
[New LWP 30165]
[New LWP 30166]
[New LWP 30167]
[New LWP 30168]
[New LWP 30169]
[New LWP 30170]
[New LWP 30171]
[New LWP 30172]
[New LWP 30173]
[New LWP 30174]
[New LWP 30175]
[New LWP 30176]
[New LWP 30177]
[New LWP 30178]
[New LWP 30179]
[New LWP 30180]
[New LWP 30181]
[New LWP 30182]
[New LWP 30183]
[New LWP 30184]
[New LWP 30185]
[New LWP 30186]
[New LWP 30187]
[New LWP 30188]
[New LWP 30189]
[New LWP 30190]
[New LWP 30191]
[New LWP 30192]
[New LWP 30193]
[New LWP 30194]
[New LWP 30195]
[New LWP 30196]
[New LWP 30197]
[New LWP 30198]
[New LWP 30199]
[New LWP 30200]
[New LWP 30201]
[New LWP 30202]
[New LWP 30203]
[New LWP 30204]
[New LWP 30205]
[New LWP 30206]
[New LWP 30207]
[New LWP 30208]
[New LWP 30209]
[New LWP 30210]
[New LWP 30211]
[New LWP 30212]
[New LWP 30213]
[New LWP 30214]
[New LWP 30215]
[New LWP 30216]
[New LWP 30217]
[New LWP 30218]
[New LWP 30219]
[New LWP 30220]
[New LWP 30221]
[New LWP 30222]
[New LWP 30223]
[New LWP 30224]
[New LWP 30225]
[New LWP 30226]
[New LWP 30227]
[New LWP 30228]
[New LWP 30229]
[New LWP 30230]
[New LWP 30231]
[New LWP 30232]
[New LWP 30233]
[New LWP 30234]
[New LWP 30235]
[New LWP 30236]
[New LWP 30237]
[New LWP 30238]
[New LWP 30239]
[New LWP 30240]
[New LWP 30241]
[New LWP 30242]
[New LWP 30243]
[New LWP 30244]
[New LWP 30245]
0x00007f92bd413d50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 30114 "kudu"  0x00007f92bd413d50 in ?? ()
  2    LWP 30117 "kudu"  0x00007f92bd40ffb9 in ?? ()
  3    LWP 30118 "kudu"  0x00007f92bd40ffb9 in ?? ()
  4    LWP 30119 "kudu"  0x00007f92bd40ffb9 in ?? ()
  5    LWP 30120 "kernel-watcher-" 0x00007f92bd40ffb9 in ?? ()
  6    LWP 30126 "ntp client-3012" 0x00007f92bd4139e2 in ?? ()
  7    LWP 30127 "file cache-evic" 0x00007f92bd40ffb9 in ?? ()
  8    LWP 30128 "sq_acceptor" 0x00007f92bb524cb9 in ?? ()
  9    LWP 30131 "rpc reactor-301" 0x00007f92bb531a47 in ?? ()
  10   LWP 30132 "rpc reactor-301" 0x00007f92bb531a47 in ?? ()
  11   LWP 30133 "rpc reactor-301" 0x00007f92bb531a47 in ?? ()
  12   LWP 30134 "rpc reactor-301" 0x00007f92bb531a47 in ?? ()
  13   LWP 30135 "MaintenanceMgr " 0x00007f92bd40fad3 in ?? ()
  14   LWP 30136 "txn-status-mana" 0x00007f92bd40ffb9 in ?? ()
  15   LWP 30137 "collect_and_rem" 0x00007f92bd40ffb9 in ?? ()
  16   LWP 30138 "tc-session-exp-" 0x00007f92bd40ffb9 in ?? ()
  17   LWP 30139 "rpc worker-3013" 0x00007f92bd40fad3 in ?? ()
  18   LWP 30140 "rpc worker-3014" 0x00007f92bd40fad3 in ?? ()
  19   LWP 30141 "rpc worker-3014" 0x00007f92bd40fad3 in ?? ()
  20   LWP 30142 "rpc worker-3014" 0x00007f92bd40fad3 in ?? ()
  21   LWP 30143 "rpc worker-3014" 0x00007f92bd40fad3 in ?? ()
  22   LWP 30144 "rpc worker-3014" 0x00007f92bd40fad3 in ?? ()
  23   LWP 30145 "rpc worker-3014" 0x00007f92bd40fad3 in ?? ()
  24   LWP 30146 "rpc worker-3014" 0x00007f92bd40fad3 in ?? ()
  25   LWP 30147 "rpc worker-3014" 0x00007f92bd40fad3 in ?? ()
  26   LWP 30148 "rpc worker-3014" 0x00007f92bd40fad3 in ?? ()
  27   LWP 30149 "rpc worker-3014" 0x00007f92bd40fad3 in ?? ()
  28   LWP 30150 "rpc worker-3015" 0x00007f92bd40fad3 in ?? ()
  29   LWP 30151 "rpc worker-3015" 0x00007f92bd40fad3 in ?? ()
  30   LWP 30152 "rpc worker-3015" 0x00007f92bd40fad3 in ?? ()
  31   LWP 30153 "rpc worker-3015" 0x00007f92bd40fad3 in ?? ()
  32   LWP 30154 "rpc worker-3015" 0x00007f92bd40fad3 in ?? ()
  33   LWP 30155 "rpc worker-3015" 0x00007f92bd40fad3 in ?? ()
  34   LWP 30156 "rpc worker-3015" 0x00007f92bd40fad3 in ?? ()
  35   LWP 30157 "rpc worker-3015" 0x00007f92bd40fad3 in ?? ()
  36   LWP 30158 "rpc worker-3015" 0x00007f92bd40fad3 in ?? ()
  37   LWP 30159 "rpc worker-3015" 0x00007f92bd40fad3 in ?? ()
  38   LWP 30160 "rpc worker-3016" 0x00007f92bd40fad3 in ?? ()
  39   LWP 30161 "rpc worker-3016" 0x00007f92bd40fad3 in ?? ()
  40   LWP 30162 "rpc worker-3016" 0x00007f92bd40fad3 in ?? ()
  41   LWP 30163 "rpc worker-3016" 0x00007f92bd40fad3 in ?? ()
  42   LWP 30164 "rpc worker-3016" 0x00007f92bd40fad3 in ?? ()
  43   LWP 30165 "rpc worker-3016" 0x00007f92bd40fad3 in ?? ()
  44   LWP 30166 "rpc worker-3016" 0x00007f92bd40fad3 in ?? ()
  45   LWP 30167 "rpc worker-3016" 0x00007f92bd40fad3 in ?? ()
  46   LWP 30168 "rpc worker-3016" 0x00007f92bd40fad3 in ?? ()
  47   LWP 30169 "rpc worker-3016" 0x00007f92bd40fad3 in ?? ()
  48   LWP 30170 "rpc worker-3017" 0x00007f92bd40fad3 in ?? ()
  49   LWP 30171 "rpc worker-3017" 0x00007f92bd40fad3 in ?? ()
  50   LWP 30172 "rpc worker-3017" 0x00007f92bd40fad3 in ?? ()
  51   LWP 30173 "rpc worker-3017" 0x00007f92bd40fad3 in ?? ()
  52   LWP 30174 "rpc worker-3017" 0x00007f92bd40fad3 in ?? ()
  53   LWP 30175 "rpc worker-3017" 0x00007f92bd40fad3 in ?? ()
  54   LWP 30176 "rpc worker-3017" 0x00007f92bd40fad3 in ?? ()
  55   LWP 30177 "rpc worker-3017" 0x00007f92bd40fad3 in ?? ()
  56   LWP 30178 "rpc worker-3017" 0x00007f92bd40fad3 in ?? ()
  57   LWP 30179 "rpc worker-3017" 0x00007f92bd40fad3 in ?? ()
  58   LWP 30180 "rpc worker-3018" 0x00007f92bd40fad3 in ?? ()
  59   LWP 30181 "rpc worker-3018" 0x00007f92bd40fad3 in ?? ()
  60   LWP 30182 "rpc worker-3018" 0x00007f92bd40fad3 in ?? ()
  61   LWP 30183 "rpc worker-3018" 0x00007f92bd40fad3 in ?? ()
  62   LWP 30184 "rpc worker-3018" 0x00007f92bd40fad3 in ?? ()
  63   LWP 30185 "rpc worker-3018" 0x00007f92bd40fad3 in ?? ()
  64   LWP 30186 "rpc worker-3018" 0x00007f92bd40fad3 in ?? ()
  65   LWP 30187 "rpc worker-3018" 0x00007f92bd40fad3 in ?? ()
  66   LWP 30188 "rpc worker-3018" 0x00007f92bd40fad3 in ?? ()
  67   LWP 30189 "rpc worker-3018" 0x00007f92bd40fad3 in ?? ()
  68   LWP 30190 "rpc worker-3019" 0x00007f92bd40fad3 in ?? ()
  69   LWP 30191 "rpc worker-3019" 0x00007f92bd40fad3 in ?? ()
  70   LWP 30192 "rpc worker-3019" 0x00007f92bd40fad3 in ?? ()
  71   LWP 30193 "rpc worker-3019" 0x00007f92bd40fad3 in ?? ()
  72   LWP 30194 "rpc worker-3019" 0x00007f92bd40fad3 in ?? ()
  73   LWP 30195 "rpc worker-3019" 0x00007f92bd40fad3 in ?? ()
  74   LWP 30196 "rpc worker-3019" 0x00007f92bd40fad3 in ?? ()
  75   LWP 30197 "rpc worker-3019" 0x00007f92bd40fad3 in ?? ()
  76   LWP 30198 "rpc worker-3019" 0x00007f92bd40fad3 in ?? ()
  77   LWP 30199 "rpc worker-3019" 0x00007f92bd40fad3 in ?? ()
  78   LWP 30200 "rpc worker-3020" 0x00007f92bd40fad3 in ?? ()
  79   LWP 30201 "rpc worker-3020" 0x00007f92bd40fad3 in ?? ()
  80   LWP 30202 "rpc worker-3020" 0x00007f92bd40fad3 in ?? ()
  81   LWP 30203 "rpc worker-3020" 0x00007f92bd40fad3 in ?? ()
  82   LWP 30204 "rpc worker-3020" 0x00007f92bd40fad3 in ?? ()
  83   LWP 30205 "rpc worker-3020" 0x00007f92bd40fad3 in ?? ()
  84   LWP 30206 "rpc worker-3020" 0x00007f92bd40fad3 in ?? ()
  85   LWP 30207 "rpc worker-3020" 0x00007f92bd40fad3 in ?? ()
  86   LWP 30208 "rpc worker-3020" 0x00007f92bd40fad3 in ?? ()
  87   LWP 30209 "rpc worker-3020" 0x00007f92bd40fad3 in ?? ()
  88   LWP 30210 "rpc worker-3021" 0x00007f92bd40fad3 in ?? ()
  89   LWP 30211 "rpc worker-3021" 0x00007f92bd40fad3 in ?? ()
  90   LWP 30212 "rpc worker-3021" 0x00007f92bd40fad3 in ?? ()
  91   LWP 30213 "rpc worker-3021" 0x00007f92bd40fad3 in ?? ()
  92   LWP 30214 "rpc worker-3021" 0x00007f92bd40fad3 in ?? ()
  93   LWP 30215 "rpc worker-3021" 0x00007f92bd40fad3 in ?? ()
  94   LWP 30216 "rpc worker-3021" 0x00007f92bd40fad3 in ?? ()
  95   LWP 30217 "rpc worker-3021" 0x00007f92bd40fad3 in ?? ()
  96   LWP 30218 "rpc worker-3021" 0x00007f92bd40fad3 in ?? ()
  97   LWP 30219 "rpc worker-3021" 0x00007f92bd40fad3 in ?? ()
  98   LWP 30220 "rpc worker-3022" 0x00007f92bd40fad3 in ?? ()
  99   LWP 30221 "rpc worker-3022" 0x00007f92bd40fad3 in ?? ()
  100  LWP 30222 "rpc worker-3022" 0x00007f92bd40fad3 in ?? ()
  101  LWP 30223 "rpc worker-3022" 0x00007f92bd40fad3 in ?? ()
  102  LWP 30224 "rpc worker-3022" 0x00007f92bd40fad3 in ?? ()
  103  LWP 30225 "rpc worker-3022" 0x00007f92bd40fad3 in ?? ()
  104  LWP 30226 "rpc worker-3022" 0x00007f92bd40fad3 in ?? ()
  105  LWP 30227 "rpc worker-3022" 0x00007f92bd40fad3 in ?? ()
  106  LWP 30228 "rpc worker-3022" 0x00007f92bd40fad3 in ?? ()
  107  LWP 30229 "rpc worker-3022" 0x00007f92bd40fad3 in ?? ()
  108  LWP 30230 "rpc worker-3023" 0x00007f92bd40fad3 in ?? ()
  109  LWP 30231 "rpc worker-3023" 0x00007f92bd40fad3 in ?? ()
  110  LWP 30232 "rpc worker-3023" 0x00007f92bd40fad3 in ?? ()
  111  LWP 30233 "rpc worker-3023" 0x00007f92bd40fad3 in ?? ()
  112  LWP 30234 "rpc worker-3023" 0x00007f92bd40fad3 in ?? ()
  113  LWP 30235 "rpc worker-3023" 0x00007f92bd40fad3 in ?? ()
  114  LWP 30236 "rpc worker-3023" 0x00007f92bd40fad3 in ?? ()
  115  LWP 30237 "rpc worker-3023" 0x00007f92bd40fad3 in ?? ()
  116  LWP 30238 "rpc worker-3023" 0x00007f92bd40fad3 in ?? ()
  117  LWP 30239 "diag-logger-302" 0x00007f92bd40ffb9 in ?? ()
  118  LWP 30240 "result-tracker-" 0x00007f92bd40ffb9 in ?? ()
  119  LWP 30241 "excess-log-dele" 0x00007f92bd40ffb9 in ?? ()
  120  LWP 30242 "tcmalloc-memory" 0x00007f92bd40ffb9 in ?? ()
  121  LWP 30243 "acceptor-30243" 0x00007f92bb5330c7 in ?? ()
  122  LWP 30244 "heartbeat-30244" 0x00007f92bd40ffb9 in ?? ()
  123  LWP 30245 "maintenance_sch" 0x00007f92bd40ffb9 in ?? ()

Thread 123 (LWP 30245):
#0  0x00007f92bd40ffb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000026 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000056022cb8de50 in ?? ()
#5  0x00007f92746c0470 in ?? ()
#6  0x000000000000004c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 30244):
#0  0x00007f92bd40ffb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000056022cadd930 in ?? ()
#5  0x00007f9274ec13f0 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 121 (LWP 30243):
#0  0x00007f92bb5330c7 in ?? ()
#1  0x00007f92756c2020 in ?? ()
#2  0x00007f92bd093ec2 in ?? ()
#3  0x00007f92756c2020 in ?? ()
#4  0x0000000000080800 in ?? ()
#5  0x00007f92756c23e0 in ?? ()
#6  0x00007f92756c2090 in ?? ()
#7  0x000056022ca884c8 in ?? ()
#8  0x00007f92bd099959 in ?? ()
#9  0x00007f92756c2510 in ?? ()
#10 0x00007f92756c2700 in ?? ()
#11 0x0000008000000000 in ?? ()
#12 0x00007f92bd4133a7 in ?? ()
#13 0x00007f92756c3520 in ?? ()
#14 0x00007f92756c2260 in ?? ()
#15 0x000056022cb39140 in ?? ()
#16 0x0000000000000000 in ?? ()

Thread 120 (LWP 30242):
#0  0x00007f92bd40ffb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffc5a0aa6b0 in ?? ()
#5  0x00007f9275ec3670 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 30241):
#0  0x00007f92bd40ffb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 118 (LWP 30240):
#0  0x00007f92bd40ffb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000056022ca11b70 in ?? ()
#5  0x00007f9276ec5680 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 30239):
#0  0x00007f92bd40ffb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000056022cd1c690 in ?? ()
#5  0x00007f92776c6550 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 116 (LWP 30238):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 115 (LWP 30237):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 114 (LWP 30236):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 30235):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 30234):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 30233):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 30232):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 30231):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 30230):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 30229):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 30228):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 30227):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 30226):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 30225):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 30224):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 30223):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 30222):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000006 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000056022cd19db8 in ?? ()
#4  0x00007f927fed75c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f927fed75e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 99 (LWP 30221):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 30220):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000056022cd19d38 in ?? ()
#4  0x00007f9280ed95c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f9280ed95e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 97 (LWP 30219):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 30218):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 30217):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 30216):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 30215):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 30214):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 30213):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 30212):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 30211):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 30210):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 30209):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 30208):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 30207):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 30206):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 30205):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 30204):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 30203):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 30202):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 30201):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 30200):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 30199):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 76 (LWP 30198):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 75 (LWP 30197):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 74 (LWP 30196):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 30195):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000056022cd19b38 in ?? ()
#4  0x00007f928d6f25c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f928d6f25e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 72 (LWP 30194):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 30193):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 30192):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 30191):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 30190):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 30189):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 30188):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 30187):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 30186):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 30185):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 30184):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 30183):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 30182):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 30181):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 30180):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 30179):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 56 (LWP 30178):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 55 (LWP 30177):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 30176):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 30175):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 30174):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 30173):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 30172):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 30171):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 30170):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 30169):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 30168):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 30167):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 30166):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 30165):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 30164):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 30163):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 30162):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 30161):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 30160):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000056022cd19a38 in ?? ()
#4  0x00007f929ef155c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f929ef155e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 37 (LWP 30159):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 36 (LWP 30158):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 35 (LWP 30157):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 34 (LWP 30156):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 33 (LWP 30155):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 32 (LWP 30154):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 31 (LWP 30153):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 30 (LWP 30152):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 30151):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 30150):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 30149):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 30148):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 30147):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 30146):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 30145):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 30144):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 30143):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 30142):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 19 (LWP 30141):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 18 (LWP 30140):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000056022cd19eb8 in ?? ()
#4  0x00007f92a8f295c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f92a8f295e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 17 (LWP 30139):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 16 (LWP 30138):
#0  0x00007f92bd40ffb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 15 (LWP 30137):
#0  0x00007f92bd40ffb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000056022c9f76c8 in ?? ()
#5  0x00007f92aa72c6a0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 30136):
#0  0x00007f92bd40ffb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 13 (LWP 30135):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 30134):
#0  0x00007f92bb531a47 in ?? ()
#1  0x00007f92abf2f680 in ?? ()
#2  0x00007f92b6835571 in ?? ()
#3  0x00000000000002df in ?? ()
#4  0x000056022caef398 in ?? ()
#5  0x00007f92abf2f6c0 in ?? ()
#6  0x00007f92abf2f840 in ?? ()
#7  0x000056022cb94670 in ?? ()
#8  0x00007f92b683725d in ?? ()
#9  0x3fb957e31e71c000 in ?? ()
#10 0x000056022cae0c00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000056022cae0c00 in ?? ()
#13 0x000000002caef398 in ?? ()
#14 0x0000560200000000 in ?? ()
#15 0x41da4f1fed70f99e in ?? ()
#16 0x000056022cb94670 in ?? ()
#17 0x00007f92abf2f720 in ?? ()
#18 0x00007f92b683bba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb957e31e71c000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 11 (LWP 30133):
#0  0x00007f92bb531a47 in ?? ()
#1  0x00007f92ac730680 in ?? ()
#2  0x00007f92b6835571 in ?? ()
#3  0x00000000000002df in ?? ()
#4  0x000056022caef018 in ?? ()
#5  0x00007f92ac7306c0 in ?? ()
#6  0x00007f92ac730840 in ?? ()
#7  0x000056022cb94670 in ?? ()
#8  0x00007f92b683725d in ?? ()
#9  0x3fb98d125dd80000 in ?? ()
#10 0x000056022cadf600 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000056022cadf600 in ?? ()
#13 0x000000002caef018 in ?? ()
#14 0x0000560200000000 in ?? ()
#15 0x41da4f1fed70f99e in ?? ()
#16 0x000056022cb94670 in ?? ()
#17 0x00007f92ac730720 in ?? ()
#18 0x00007f92b683bba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb98d125dd80000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 10 (LWP 30132):
#0  0x00007f92bb531a47 in ?? ()
#1  0x00007f92acf31680 in ?? ()
#2  0x00007f92b6835571 in ?? ()
#3  0x00000000000002df in ?? ()
#4  0x000056022caef558 in ?? ()
#5  0x00007f92acf316c0 in ?? ()
#6  0x00007f92acf31840 in ?? ()
#7  0x000056022cb94670 in ?? ()
#8  0x00007f92b683725d in ?? ()
#9  0x3fb98b3382e30000 in ?? ()
#10 0x000056022cadfb80 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000056022cadfb80 in ?? ()
#13 0x000000002caef558 in ?? ()
#14 0x0000560200000000 in ?? ()
#15 0x41da4f1fed70f99d in ?? ()
#16 0x000056022cb94670 in ?? ()
#17 0x00007f92acf31720 in ?? ()
#18 0x00007f92b683bba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb98b3382e30000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 9 (LWP 30131):
#0  0x00007f92bb531a47 in ?? ()
#1  0x00007f92aeb13680 in ?? ()
#2  0x00007f92b6835571 in ?? ()
#3  0x00000000000002df in ?? ()
#4  0x000056022caeee58 in ?? ()
#5  0x00007f92aeb136c0 in ?? ()
#6  0x00007f92aeb13840 in ?? ()
#7  0x000056022cb94670 in ?? ()
#8  0x00007f92b683725d in ?? ()
#9  0x3fb95adf9f838000 in ?? ()
#10 0x000056022cae0680 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000056022cae0680 in ?? ()
#13 0x000000002caeee58 in ?? ()
#14 0x0000560200000000 in ?? ()
#15 0x41da4f1fed70f99d in ?? ()
#16 0x000056022cb94670 in ?? ()
#17 0x00007f92aeb13720 in ?? ()
#18 0x00007f92b683bba3 in ?? ()
#19 0x0000000000000000 in ?? ()

Thread 8 (LWP 30128):
#0  0x00007f92bb524cb9 in ?? ()
#1  0x00007f92b0316840 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 7 (LWP 30127):
#0  0x00007f92bd40ffb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 6 (LWP 30126):
#0  0x00007f92bd4139e2 in ?? ()
#1  0x000056022ca11ee0 in ?? ()
#2  0x00007f92af3144d0 in ?? ()
#3  0x00007f92af314450 in ?? ()
#4  0x00007f92af314570 in ?? ()
#5  0x00007f92af314790 in ?? ()
#6  0x00007f92af3147a0 in ?? ()
#7  0x00007f92af3144e0 in ?? ()
#8  0x00007f92af3144d0 in ?? ()
#9  0x000056022ca10350 in ?? ()
#10 0x00007f92bd7fec6f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 5 (LWP 30120):
#0  0x00007f92bd40ffb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000030 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000056022cb96dc8 in ?? ()
#5  0x00007f92b1318430 in ?? ()
#6  0x0000000000000060 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 4 (LWP 30119):
#0  0x00007f92bd40ffb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000056022c9f6848 in ?? ()
#5  0x00007f92b1b19790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 3 (LWP 30118):
#0  0x00007f92bd40ffb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000056022c9f62a8 in ?? ()
#5  0x00007f92b231a790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 2 (LWP 30117):
#0  0x00007f92bd40ffb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000056022c9f6188 in ?? ()
#5  0x00007f92b2b1b790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 1 (LWP 30114):
#0  0x00007f92bd413d50 in ?? ()
#1  0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20251212 21:11:36.280943 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 29981
I20251212 21:11:36.291064 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 29846
I20251212 21:11:36.302075 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 30247
I20251212 21:11:36.312151 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 30114
I20251212 21:11:36.317694 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 25090
2025-12-12T21:11:36Z chronyd exiting
I20251212 21:11:36.334146 23994 test_util.cc:183] -----------------------------------------------
I20251212 21:11:36.334224 23994 test_util.cc:184] Had failures, leaving test files at /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0

Full log

Note: This is test shard 1 of 8.
[==========] Running 2 tests from 2 test suites.
[----------] Global test environment set-up.
[----------] 1 test from MaintenanceModeRF3ITest
[ RUN      ] MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate
2025-12-12T21:10:31Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2025-12-12T21:10:31Z Disabled control of system clock
WARNING: Logging before InitGoogleLogging() is written to STDERR
I20251212 21:10:31.649343 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.23.110.190:46669
--webserver_interface=127.23.110.190
--webserver_port=0
--builtin_ntp_servers=127.23.110.148:34329
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.23.110.190:46669 with env {}
W20251212 21:10:31.730477 24002 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:10:31.730669 24002 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:10:31.730690 24002 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:10:31.732159 24002 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20251212 21:10:31.732209 24002 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:10:31.732221 24002 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20251212 21:10:31.732232 24002 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20251212 21:10:31.733940 24002 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:34329
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.23.110.190:46669
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.23.110.190:46669
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.23.110.190
--webserver_port=0
--never_fsync=true
--heap_profile_path=/tmp/kudu.24002
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:10:31.734151 24002 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:10:31.734357 24002 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:10:31.737404 24010 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:31.737435 24007 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:31.737521 24008 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:10:31.737609 24002 server_base.cc:1047] running on GCE node
I20251212 21:10:31.738090 24002 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:10:31.738392 24002 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:10:31.739560 24002 hybrid_clock.cc:648] HybridClock initialized: now 1765573831739539 us; error 37 us; skew 500 ppm
I20251212 21:10:31.740892 24002 webserver.cc:492] Webserver started at http://127.23.110.190:43745/ using document root <none> and password file <none>
I20251212 21:10:31.741110 24002 fs_manager.cc:362] Metadata directory not provided
I20251212 21:10:31.741155 24002 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:10:31.741307 24002 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20251212 21:10:31.742209 24002 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/data/instance:
uuid: "68cde708327e4da5bdb1432082ba3f7b"
format_stamp: "Formatted at 2025-12-12 21:10:31 on dist-test-slave-rz82"
I20251212 21:10:31.742501 24002 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/wal/instance:
uuid: "68cde708327e4da5bdb1432082ba3f7b"
format_stamp: "Formatted at 2025-12-12 21:10:31 on dist-test-slave-rz82"
I20251212 21:10:31.743759 24002 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.002s	sys 0.000s
I20251212 21:10:31.744560 24016 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:31.744769 24002 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:31.744853 24002 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/wal
uuid: "68cde708327e4da5bdb1432082ba3f7b"
format_stamp: "Formatted at 2025-12-12 21:10:31 on dist-test-slave-rz82"
I20251212 21:10:31.744912 24002 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251212 21:10:31.759889 24002 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:10:31.760176 24002 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:10:31.760281 24002 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:10:31.764144 24002 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.190:46669
I20251212 21:10:31.764173 24068 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.190:46669 every 8 connection(s)
I20251212 21:10:31.764501 24002 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/data/info.pb
I20251212 21:10:31.765166 24069 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20251212 21:10:31.767530 24069 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b: Bootstrap starting.
I20251212 21:10:31.768117 24069 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b: Neither blocks nor log segments found. Creating new log.
I20251212 21:10:31.768368 24069 log.cc:826] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b: Log is configured to *not* fsync() on all Append() calls
I20251212 21:10:31.768975 24069 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b: No bootstrap required, opened a new log
I20251212 21:10:31.770283 24069 raft_consensus.cc:359] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "68cde708327e4da5bdb1432082ba3f7b" member_type: VOTER last_known_addr { host: "127.23.110.190" port: 46669 } }
I20251212 21:10:31.770422 24069 raft_consensus.cc:385] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251212 21:10:31.770442 24069 raft_consensus.cc:740] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 68cde708327e4da5bdb1432082ba3f7b, State: Initialized, Role: FOLLOWER
I20251212 21:10:31.770555 24069 consensus_queue.cc:260] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "68cde708327e4da5bdb1432082ba3f7b" member_type: VOTER last_known_addr { host: "127.23.110.190" port: 46669 } }
I20251212 21:10:31.770618 24069 raft_consensus.cc:399] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20251212 21:10:31.770646 24069 raft_consensus.cc:493] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20251212 21:10:31.770678 24069 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [term 0 FOLLOWER]: Advancing to term 1
I20251212 21:10:31.771207 24069 raft_consensus.cc:515] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "68cde708327e4da5bdb1432082ba3f7b" member_type: VOTER last_known_addr { host: "127.23.110.190" port: 46669 } }
I20251212 21:10:31.771309 24069 leader_election.cc:304] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 68cde708327e4da5bdb1432082ba3f7b; no voters: 
I20251212 21:10:31.771466 24069 leader_election.cc:290] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [CANDIDATE]: Term 1 election: Requested vote from peers 
I20251212 21:10:31.771610 24072 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [term 1 FOLLOWER]: Leader election won for term 1
I20251212 21:10:31.771689 24069 sys_catalog.cc:565] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [sys.catalog]: configured and running, proceeding with master startup.
I20251212 21:10:31.771754 24072 raft_consensus.cc:697] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [term 1 LEADER]: Becoming Leader. State: Replica: 68cde708327e4da5bdb1432082ba3f7b, State: Running, Role: LEADER
I20251212 21:10:31.771869 24072 consensus_queue.cc:237] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "68cde708327e4da5bdb1432082ba3f7b" member_type: VOTER last_known_addr { host: "127.23.110.190" port: 46669 } }
I20251212 21:10:31.772282 24074 sys_catalog.cc:455] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [sys.catalog]: SysCatalogTable state changed. Reason: New leader 68cde708327e4da5bdb1432082ba3f7b. Latest consensus state: current_term: 1 leader_uuid: "68cde708327e4da5bdb1432082ba3f7b" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "68cde708327e4da5bdb1432082ba3f7b" member_type: VOTER last_known_addr { host: "127.23.110.190" port: 46669 } } }
I20251212 21:10:31.772452 24074 sys_catalog.cc:458] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [sys.catalog]: This master's current role is: LEADER
I20251212 21:10:31.772403 24073 sys_catalog.cc:455] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "68cde708327e4da5bdb1432082ba3f7b" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "68cde708327e4da5bdb1432082ba3f7b" member_type: VOTER last_known_addr { host: "127.23.110.190" port: 46669 } } }
I20251212 21:10:31.772631 24073 sys_catalog.cc:458] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [sys.catalog]: This master's current role is: LEADER
I20251212 21:10:31.772734 24081 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20251212 21:10:31.773209 24081 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20251212 21:10:31.773496 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 24002
I20251212 21:10:31.773602 23994 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/wal/instance
I20251212 21:10:31.775254 24081 catalog_manager.cc:1357] Generated new cluster ID: 9b84b6e4e0ca404cb9b70fc04d1c13f0
I20251212 21:10:31.775315 24081 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20251212 21:10:31.790449 24081 catalog_manager.cc:1380] Generated new certificate authority record
I20251212 21:10:31.791145 24081 catalog_manager.cc:1514] Loading token signing keys...
I20251212 21:10:31.795378 24081 catalog_manager.cc:6027] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b: Generated new TSK 0
I20251212 21:10:31.795640 24081 catalog_manager.cc:1524] Initializing in-progress tserver states...
I20251212 21:10:31.797868 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.129:0
--local_ip_for_outbound_sockets=127.23.110.129
--webserver_interface=127.23.110.129
--webserver_port=0
--tserver_master_addrs=127.23.110.190:46669
--builtin_ntp_servers=127.23.110.148:34329
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
W20251212 21:10:31.909837 24093 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:10:31.910081 24093 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:10:31.910113 24093 flags.cc:425] Enabled unsafe flag: --enable_log_gc=false
W20251212 21:10:31.910132 24093 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:10:31.912441 24093 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:10:31.912545 24093 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.129
I20251212 21:10:31.915186 24093 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:34329
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.129:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.23.110.129
--webserver_port=0
--enable_log_gc=false
--tserver_master_addrs=127.23.110.190:46669
--never_fsync=true
--heap_profile_path=/tmp/kudu.24093
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.129
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:10:31.915482 24093 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:10:31.915796 24093 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:10:31.918880 24099 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:31.918884 24098 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:31.919049 24101 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:10:31.919068 24093 server_base.cc:1047] running on GCE node
I20251212 21:10:31.919301 24093 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:10:31.919554 24093 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:10:31.920737 24093 hybrid_clock.cc:648] HybridClock initialized: now 1765573831920710 us; error 45 us; skew 500 ppm
I20251212 21:10:31.922240 24093 webserver.cc:492] Webserver started at http://127.23.110.129:41077/ using document root <none> and password file <none>
I20251212 21:10:31.922473 24093 fs_manager.cc:362] Metadata directory not provided
I20251212 21:10:31.922530 24093 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:10:31.922655 24093 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20251212 21:10:31.923822 24093 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/data/instance:
uuid: "f1d79b00d76349faa9c59b372f7877ba"
format_stamp: "Formatted at 2025-12-12 21:10:31 on dist-test-slave-rz82"
I20251212 21:10:31.924193 24093 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/wal/instance:
uuid: "f1d79b00d76349faa9c59b372f7877ba"
format_stamp: "Formatted at 2025-12-12 21:10:31 on dist-test-slave-rz82"
I20251212 21:10:31.925760 24093 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.003s	sys 0.000s
I20251212 21:10:31.926719 24107 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:31.926923 24093 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:31.927011 24093 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/wal
uuid: "f1d79b00d76349faa9c59b372f7877ba"
format_stamp: "Formatted at 2025-12-12 21:10:31 on dist-test-slave-rz82"
I20251212 21:10:31.927084 24093 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251212 21:10:31.945312 24093 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:10:31.945605 24093 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:10:31.945734 24093 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:10:31.945997 24093 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:10:31.946350 24093 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251212 21:10:31.946393 24093 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:31.946429 24093 ts_tablet_manager.cc:616] Registered 0 tablets
I20251212 21:10:31.946453 24093 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:31.953166 24093 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.129:41523
I20251212 21:10:31.953199 24220 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.129:41523 every 8 connection(s)
I20251212 21:10:31.953588 24093 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/data/info.pb
I20251212 21:10:31.958154 24221 heartbeater.cc:344] Connected to a master server at 127.23.110.190:46669
I20251212 21:10:31.958284 24221 heartbeater.cc:461] Registering TS with master...
I20251212 21:10:31.958500 24221 heartbeater.cc:507] Master 127.23.110.190:46669 requested a full tablet report, sending...
I20251212 21:10:31.959003 24033 ts_manager.cc:194] Registered new tserver with Master: f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523)
I20251212 21:10:31.959761 24033 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.129:52519
I20251212 21:10:31.963083 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 24093
I20251212 21:10:31.963161 23994 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/wal/instance
I20251212 21:10:31.964303 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.130:0
--local_ip_for_outbound_sockets=127.23.110.130
--webserver_interface=127.23.110.130
--webserver_port=0
--tserver_master_addrs=127.23.110.190:46669
--builtin_ntp_servers=127.23.110.148:34329
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
W20251212 21:10:32.053458 24224 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:10:32.053639 24224 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:10:32.053658 24224 flags.cc:425] Enabled unsafe flag: --enable_log_gc=false
W20251212 21:10:32.053670 24224 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:10:32.055136 24224 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:10:32.055207 24224 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.130
I20251212 21:10:32.056813 24224 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:34329
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.130:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.23.110.130
--webserver_port=0
--enable_log_gc=false
--tserver_master_addrs=127.23.110.190:46669
--never_fsync=true
--heap_profile_path=/tmp/kudu.24224
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.130
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:10:32.057067 24224 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:10:32.057334 24224 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:10:32.060010 24229 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:32.060022 24230 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:32.060032 24232 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:10:32.060289 24224 server_base.cc:1047] running on GCE node
I20251212 21:10:32.060479 24224 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:10:32.060660 24224 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:10:32.061820 24224 hybrid_clock.cc:648] HybridClock initialized: now 1765573832061805 us; error 29 us; skew 500 ppm
I20251212 21:10:32.062969 24224 webserver.cc:492] Webserver started at http://127.23.110.130:32843/ using document root <none> and password file <none>
I20251212 21:10:32.063169 24224 fs_manager.cc:362] Metadata directory not provided
I20251212 21:10:32.063210 24224 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:10:32.063318 24224 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20251212 21:10:32.064205 24224 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-1/data/instance:
uuid: "0ecacd125c104184b71910c5597cef64"
format_stamp: "Formatted at 2025-12-12 21:10:32 on dist-test-slave-rz82"
I20251212 21:10:32.064507 24224 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-1/wal/instance:
uuid: "0ecacd125c104184b71910c5597cef64"
format_stamp: "Formatted at 2025-12-12 21:10:32 on dist-test-slave-rz82"
I20251212 21:10:32.065676 24224 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.000s	sys 0.003s
I20251212 21:10:32.066490 24238 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:32.066781 24224 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.000s	sys 0.001s
I20251212 21:10:32.066864 24224 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-1/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-1/wal
uuid: "0ecacd125c104184b71910c5597cef64"
format_stamp: "Formatted at 2025-12-12 21:10:32 on dist-test-slave-rz82"
I20251212 21:10:32.066928 24224 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251212 21:10:32.083875 24224 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:10:32.084146 24224 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:10:32.084255 24224 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:10:32.084488 24224 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:10:32.084820 24224 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251212 21:10:32.084852 24224 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:32.084882 24224 ts_tablet_manager.cc:616] Registered 0 tablets
I20251212 21:10:32.084897 24224 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:32.090253 24224 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.130:35585
I20251212 21:10:32.090327 24351 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.130:35585 every 8 connection(s)
I20251212 21:10:32.090626 24224 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-1/data/info.pb
I20251212 21:10:32.095474 24352 heartbeater.cc:344] Connected to a master server at 127.23.110.190:46669
I20251212 21:10:32.095585 24352 heartbeater.cc:461] Registering TS with master...
I20251212 21:10:32.095777 24352 heartbeater.cc:507] Master 127.23.110.190:46669 requested a full tablet report, sending...
I20251212 21:10:32.096177 24033 ts_manager.cc:194] Registered new tserver with Master: 0ecacd125c104184b71910c5597cef64 (127.23.110.130:35585)
I20251212 21:10:32.096632 24033 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.130:41639
I20251212 21:10:32.099301 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 24224
I20251212 21:10:32.099388 23994 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-1/wal/instance
I20251212 21:10:32.100447 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.131:0
--local_ip_for_outbound_sockets=127.23.110.131
--webserver_interface=127.23.110.131
--webserver_port=0
--tserver_master_addrs=127.23.110.190:46669
--builtin_ntp_servers=127.23.110.148:34329
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
W20251212 21:10:32.179337 24355 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:10:32.179538 24355 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:10:32.179562 24355 flags.cc:425] Enabled unsafe flag: --enable_log_gc=false
W20251212 21:10:32.179581 24355 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:10:32.181041 24355 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:10:32.181115 24355 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.131
I20251212 21:10:32.182746 24355 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:34329
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.131:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.23.110.131
--webserver_port=0
--enable_log_gc=false
--tserver_master_addrs=127.23.110.190:46669
--never_fsync=true
--heap_profile_path=/tmp/kudu.24355
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.131
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:10:32.182996 24355 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:10:32.183219 24355 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:10:32.185959 24363 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:32.186102 24361 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:32.186111 24360 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:10:32.186220 24355 server_base.cc:1047] running on GCE node
I20251212 21:10:32.186388 24355 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:10:32.186619 24355 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:10:32.187762 24355 hybrid_clock.cc:648] HybridClock initialized: now 1765573832187745 us; error 30 us; skew 500 ppm
I20251212 21:10:32.188963 24355 webserver.cc:492] Webserver started at http://127.23.110.131:36413/ using document root <none> and password file <none>
I20251212 21:10:32.189152 24355 fs_manager.cc:362] Metadata directory not provided
I20251212 21:10:32.189193 24355 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:10:32.189324 24355 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20251212 21:10:32.190215 24355 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-2/data/instance:
uuid: "2547ecb5f0e74137add24df5b3fbac48"
format_stamp: "Formatted at 2025-12-12 21:10:32 on dist-test-slave-rz82"
I20251212 21:10:32.190507 24355 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-2/wal/instance:
uuid: "2547ecb5f0e74137add24df5b3fbac48"
format_stamp: "Formatted at 2025-12-12 21:10:32 on dist-test-slave-rz82"
I20251212 21:10:32.191648 24355 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.002s	sys 0.000s
I20251212 21:10:32.192416 24369 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:32.192633 24355 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:32.192706 24355 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-2/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-2/wal
uuid: "2547ecb5f0e74137add24df5b3fbac48"
format_stamp: "Formatted at 2025-12-12 21:10:32 on dist-test-slave-rz82"
I20251212 21:10:32.192765 24355 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251212 21:10:32.209096 24355 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:10:32.209394 24355 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:10:32.209513 24355 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:10:32.209722 24355 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:10:32.210057 24355 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251212 21:10:32.210089 24355 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:32.210119 24355 ts_tablet_manager.cc:616] Registered 0 tablets
I20251212 21:10:32.210137 24355 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:32.215607 24355 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.131:35821
I20251212 21:10:32.215665 24482 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.131:35821 every 8 connection(s)
I20251212 21:10:32.215965 24355 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-2/data/info.pb
I20251212 21:10:32.220945 24483 heartbeater.cc:344] Connected to a master server at 127.23.110.190:46669
I20251212 21:10:32.221058 24483 heartbeater.cc:461] Registering TS with master...
I20251212 21:10:32.221226 24483 heartbeater.cc:507] Master 127.23.110.190:46669 requested a full tablet report, sending...
I20251212 21:10:32.221661 24033 ts_manager.cc:194] Registered new tserver with Master: 2547ecb5f0e74137add24df5b3fbac48 (127.23.110.131:35821)
I20251212 21:10:32.222016 24033 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.131:58987
I20251212 21:10:32.224577 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 24355
I20251212 21:10:32.224648 23994 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-2/wal/instance
I20251212 21:10:32.225891 23994 external_mini_cluster.cc:949] 3 TS(s) registered with all masters
I20251212 21:10:32.232550 23994 test_util.cc:276] Using random seed: -1323807134
I20251212 21:10:32.240108 24033 catalog_manager.cc:2257] Servicing CreateTable request from {username='slave'} at 127.0.0.1:55968:
name: "test-workload"
schema {
  columns {
    name: "key"
    type: INT32
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "int_val"
    type: INT32
    is_key: false
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "string_val"
    type: STRING
    is_key: false
    is_nullable: true
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
  rows: "<redacted>""\004\001\000UUU\025\004\001\000\252\252\252*\004\001\000\377\377\377?\004\001\000TUUU\004\001\000\251\252\252j"
  indirect_data: "<redacted>"""
}
partition_schema {
  range_schema {
    columns {
      name: "key"
    }
  }
}
W20251212 21:10:32.240402 24033 catalog_manager.cc:7016] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table test-workload in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20251212 21:10:32.247584 24417 tablet_service.cc:1505] Processing CreateTablet for tablet a3ecd74d94aa4867aab9e37f1675cc27 (DEFAULT_TABLE table=test-workload [id=36b89bd33e2b4f068e2cb64fca3c13b6]), partition=RANGE (key) PARTITION VALUES < 357913941
I20251212 21:10:32.247584 24416 tablet_service.cc:1505] Processing CreateTablet for tablet 9831c0f3d91045398ce4c2183f7db85d (DEFAULT_TABLE table=test-workload [id=36b89bd33e2b4f068e2cb64fca3c13b6]), partition=RANGE (key) PARTITION 357913941 <= VALUES < 715827882
I20251212 21:10:32.248014 24416 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 9831c0f3d91045398ce4c2183f7db85d. 1 dirs total, 0 dirs full, 0 dirs failed
I20251212 21:10:32.248443 24155 tablet_service.cc:1505] Processing CreateTablet for tablet a3ecd74d94aa4867aab9e37f1675cc27 (DEFAULT_TABLE table=test-workload [id=36b89bd33e2b4f068e2cb64fca3c13b6]), partition=RANGE (key) PARTITION VALUES < 357913941
I20251212 21:10:32.248543 24152 tablet_service.cc:1505] Processing CreateTablet for tablet b47a8b191555448b9dd7ec692a253b73 (DEFAULT_TABLE table=test-workload [id=36b89bd33e2b4f068e2cb64fca3c13b6]), partition=RANGE (key) PARTITION 1073741823 <= VALUES < 1431655764
I20251212 21:10:32.248646 24152 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet b47a8b191555448b9dd7ec692a253b73. 1 dirs total, 0 dirs full, 0 dirs failed
I20251212 21:10:32.248338 24153 tablet_service.cc:1505] Processing CreateTablet for tablet 074cc25aad644982ae7d025718ae01f5 (DEFAULT_TABLE table=test-workload [id=36b89bd33e2b4f068e2cb64fca3c13b6]), partition=RANGE (key) PARTITION 715827882 <= VALUES < 1073741823
I20251212 21:10:32.248338 24154 tablet_service.cc:1505] Processing CreateTablet for tablet 9831c0f3d91045398ce4c2183f7db85d (DEFAULT_TABLE table=test-workload [id=36b89bd33e2b4f068e2cb64fca3c13b6]), partition=RANGE (key) PARTITION 357913941 <= VALUES < 715827882
I20251212 21:10:32.248930 24153 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 074cc25aad644982ae7d025718ae01f5. 1 dirs total, 0 dirs full, 0 dirs failed
I20251212 21:10:32.249063 24155 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet a3ecd74d94aa4867aab9e37f1675cc27. 1 dirs total, 0 dirs full, 0 dirs failed
I20251212 21:10:32.249161 24154 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 9831c0f3d91045398ce4c2183f7db85d. 1 dirs total, 0 dirs full, 0 dirs failed
I20251212 21:10:32.247584 24413 tablet_service.cc:1505] Processing CreateTablet for tablet d0f2580352f640e794cf95624f9b32d0 (DEFAULT_TABLE table=test-workload [id=36b89bd33e2b4f068e2cb64fca3c13b6]), partition=RANGE (key) PARTITION 1431655764 <= VALUES < 1789569705
I20251212 21:10:32.249531 24413 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet d0f2580352f640e794cf95624f9b32d0. 1 dirs total, 0 dirs full, 0 dirs failed
I20251212 21:10:32.249723 24151 tablet_service.cc:1505] Processing CreateTablet for tablet d0f2580352f640e794cf95624f9b32d0 (DEFAULT_TABLE table=test-workload [id=36b89bd33e2b4f068e2cb64fca3c13b6]), partition=RANGE (key) PARTITION 1431655764 <= VALUES < 1789569705
I20251212 21:10:32.249809 24151 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet d0f2580352f640e794cf95624f9b32d0. 1 dirs total, 0 dirs full, 0 dirs failed
I20251212 21:10:32.250545 24150 tablet_service.cc:1505] Processing CreateTablet for tablet 0ea0cdb48a6640879d462b9e52dcbcbd (DEFAULT_TABLE table=test-workload [id=36b89bd33e2b4f068e2cb64fca3c13b6]), partition=RANGE (key) PARTITION 1789569705 <= VALUES
I20251212 21:10:32.250622 24150 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 0ea0cdb48a6640879d462b9e52dcbcbd. 1 dirs total, 0 dirs full, 0 dirs failed
I20251212 21:10:32.247584 24415 tablet_service.cc:1505] Processing CreateTablet for tablet 074cc25aad644982ae7d025718ae01f5 (DEFAULT_TABLE table=test-workload [id=36b89bd33e2b4f068e2cb64fca3c13b6]), partition=RANGE (key) PARTITION 715827882 <= VALUES < 1073741823
I20251212 21:10:32.250788 24415 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 074cc25aad644982ae7d025718ae01f5. 1 dirs total, 0 dirs full, 0 dirs failed
I20251212 21:10:32.250864 24417 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet a3ecd74d94aa4867aab9e37f1675cc27. 1 dirs total, 0 dirs full, 0 dirs failed
I20251212 21:10:32.251693 24414 tablet_service.cc:1505] Processing CreateTablet for tablet b47a8b191555448b9dd7ec692a253b73 (DEFAULT_TABLE table=test-workload [id=36b89bd33e2b4f068e2cb64fca3c13b6]), partition=RANGE (key) PARTITION 1073741823 <= VALUES < 1431655764
I20251212 21:10:32.251783 24414 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet b47a8b191555448b9dd7ec692a253b73. 1 dirs total, 0 dirs full, 0 dirs failed
I20251212 21:10:32.247649 24412 tablet_service.cc:1505] Processing CreateTablet for tablet 0ea0cdb48a6640879d462b9e52dcbcbd (DEFAULT_TABLE table=test-workload [id=36b89bd33e2b4f068e2cb64fca3c13b6]), partition=RANGE (key) PARTITION 1789569705 <= VALUES
I20251212 21:10:32.252820 24412 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 0ea0cdb48a6640879d462b9e52dcbcbd. 1 dirs total, 0 dirs full, 0 dirs failed
I20251212 21:10:32.253075 24503 tablet_bootstrap.cc:492] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba: Bootstrap starting.
I20251212 21:10:32.253310 24502 tablet_bootstrap.cc:492] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48: Bootstrap starting.
I20251212 21:10:32.253641 24503 tablet_bootstrap.cc:654] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba: Neither blocks nor log segments found. Creating new log.
I20251212 21:10:32.254056 24503 log.cc:826] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba: Log is configured to *not* fsync() on all Append() calls
I20251212 21:10:32.254210 24502 tablet_bootstrap.cc:654] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48: Neither blocks nor log segments found. Creating new log.
I20251212 21:10:32.254482 24502 log.cc:826] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48: Log is configured to *not* fsync() on all Append() calls
I20251212 21:10:32.254717 24503 tablet_bootstrap.cc:492] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba: No bootstrap required, opened a new log
I20251212 21:10:32.254855 24503 ts_tablet_manager.cc:1403] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba: Time spent bootstrapping tablet: real 0.002s	user 0.002s	sys 0.000s
I20251212 21:10:32.255216 24502 tablet_bootstrap.cc:492] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48: No bootstrap required, opened a new log
I20251212 21:10:32.255285 24502 ts_tablet_manager.cc:1403] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48: Time spent bootstrapping tablet: real 0.002s	user 0.000s	sys 0.001s
I20251212 21:10:32.255777 24285 tablet_service.cc:1505] Processing CreateTablet for tablet 9831c0f3d91045398ce4c2183f7db85d (DEFAULT_TABLE table=test-workload [id=36b89bd33e2b4f068e2cb64fca3c13b6]), partition=RANGE (key) PARTITION 357913941 <= VALUES < 715827882
I20251212 21:10:32.255934 24281 tablet_service.cc:1505] Processing CreateTablet for tablet 0ea0cdb48a6640879d462b9e52dcbcbd (DEFAULT_TABLE table=test-workload [id=36b89bd33e2b4f068e2cb64fca3c13b6]), partition=RANGE (key) PARTITION 1789569705 <= VALUES
I20251212 21:10:32.255780 24284 tablet_service.cc:1505] Processing CreateTablet for tablet 074cc25aad644982ae7d025718ae01f5 (DEFAULT_TABLE table=test-workload [id=36b89bd33e2b4f068e2cb64fca3c13b6]), partition=RANGE (key) PARTITION 715827882 <= VALUES < 1073741823
I20251212 21:10:32.256151 24284 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 074cc25aad644982ae7d025718ae01f5. 1 dirs total, 0 dirs full, 0 dirs failed
I20251212 21:10:32.256788 24502 raft_consensus.cc:359] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.256940 24502 raft_consensus.cc:385] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251212 21:10:32.256970 24502 raft_consensus.cc:740] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 2547ecb5f0e74137add24df5b3fbac48, State: Initialized, Role: FOLLOWER
I20251212 21:10:32.256983 24503 raft_consensus.cc:359] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.257099 24503 raft_consensus.cc:385] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251212 21:10:32.257126 24503 raft_consensus.cc:740] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: f1d79b00d76349faa9c59b372f7877ba, State: Initialized, Role: FOLLOWER
I20251212 21:10:32.257112 24502 consensus_queue.cc:260] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.257269 24503 consensus_queue.cc:260] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.257400 24502 ts_tablet_manager.cc:1434] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48: Time spent starting tablet: real 0.002s	user 0.000s	sys 0.002s
I20251212 21:10:32.257474 24483 heartbeater.cc:499] Master 127.23.110.190:46669 was elected leader, sending a full tablet report...
I20251212 21:10:32.257675 24503 ts_tablet_manager.cc:1434] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba: Time spent starting tablet: real 0.003s	user 0.002s	sys 0.000s
I20251212 21:10:32.257759 24503 tablet_bootstrap.cc:492] T 0ea0cdb48a6640879d462b9e52dcbcbd P f1d79b00d76349faa9c59b372f7877ba: Bootstrap starting.
I20251212 21:10:32.257825 24221 heartbeater.cc:499] Master 127.23.110.190:46669 was elected leader, sending a full tablet report...
I20251212 21:10:32.258167 24503 tablet_bootstrap.cc:654] T 0ea0cdb48a6640879d462b9e52dcbcbd P f1d79b00d76349faa9c59b372f7877ba: Neither blocks nor log segments found. Creating new log.
I20251212 21:10:32.258533 24508 tablet_bootstrap.cc:492] T 074cc25aad644982ae7d025718ae01f5 P 0ecacd125c104184b71910c5597cef64: Bootstrap starting.
I20251212 21:10:32.258702 24503 tablet_bootstrap.cc:492] T 0ea0cdb48a6640879d462b9e52dcbcbd P f1d79b00d76349faa9c59b372f7877ba: No bootstrap required, opened a new log
I20251212 21:10:32.258740 24503 ts_tablet_manager.cc:1403] T 0ea0cdb48a6640879d462b9e52dcbcbd P f1d79b00d76349faa9c59b372f7877ba: Time spent bootstrapping tablet: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:32.258872 24503 raft_consensus.cc:359] T 0ea0cdb48a6640879d462b9e52dcbcbd P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.258929 24503 raft_consensus.cc:385] T 0ea0cdb48a6640879d462b9e52dcbcbd P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251212 21:10:32.258945 24503 raft_consensus.cc:740] T 0ea0cdb48a6640879d462b9e52dcbcbd P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: f1d79b00d76349faa9c59b372f7877ba, State: Initialized, Role: FOLLOWER
I20251212 21:10:32.258980 24503 consensus_queue.cc:260] T 0ea0cdb48a6640879d462b9e52dcbcbd P f1d79b00d76349faa9c59b372f7877ba [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.259058 24503 ts_tablet_manager.cc:1434] T 0ea0cdb48a6640879d462b9e52dcbcbd P f1d79b00d76349faa9c59b372f7877ba: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:32.259109 24503 tablet_bootstrap.cc:492] T d0f2580352f640e794cf95624f9b32d0 P f1d79b00d76349faa9c59b372f7877ba: Bootstrap starting.
I20251212 21:10:32.255827 24283 tablet_service.cc:1505] Processing CreateTablet for tablet b47a8b191555448b9dd7ec692a253b73 (DEFAULT_TABLE table=test-workload [id=36b89bd33e2b4f068e2cb64fca3c13b6]), partition=RANGE (key) PARTITION 1073741823 <= VALUES < 1431655764
I20251212 21:10:32.259193 24283 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet b47a8b191555448b9dd7ec692a253b73. 1 dirs total, 0 dirs full, 0 dirs failed
I20251212 21:10:32.259253 24508 tablet_bootstrap.cc:654] T 074cc25aad644982ae7d025718ae01f5 P 0ecacd125c104184b71910c5597cef64: Neither blocks nor log segments found. Creating new log.
I20251212 21:10:32.259367 24502 tablet_bootstrap.cc:492] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48: Bootstrap starting.
I20251212 21:10:32.259495 24503 tablet_bootstrap.cc:654] T d0f2580352f640e794cf95624f9b32d0 P f1d79b00d76349faa9c59b372f7877ba: Neither blocks nor log segments found. Creating new log.
I20251212 21:10:32.259505 24508 log.cc:826] T 074cc25aad644982ae7d025718ae01f5 P 0ecacd125c104184b71910c5597cef64: Log is configured to *not* fsync() on all Append() calls
I20251212 21:10:32.259806 24502 tablet_bootstrap.cc:654] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48: Neither blocks nor log segments found. Creating new log.
I20251212 21:10:32.260079 24503 tablet_bootstrap.cc:492] T d0f2580352f640e794cf95624f9b32d0 P f1d79b00d76349faa9c59b372f7877ba: No bootstrap required, opened a new log
I20251212 21:10:32.260126 24503 ts_tablet_manager.cc:1403] T d0f2580352f640e794cf95624f9b32d0 P f1d79b00d76349faa9c59b372f7877ba: Time spent bootstrapping tablet: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:32.260272 24503 raft_consensus.cc:359] T d0f2580352f640e794cf95624f9b32d0 P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.260320 24503 raft_consensus.cc:385] T d0f2580352f640e794cf95624f9b32d0 P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251212 21:10:32.260337 24503 raft_consensus.cc:740] T d0f2580352f640e794cf95624f9b32d0 P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: f1d79b00d76349faa9c59b372f7877ba, State: Initialized, Role: FOLLOWER
I20251212 21:10:32.260345 24502 tablet_bootstrap.cc:492] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48: No bootstrap required, opened a new log
I20251212 21:10:32.260377 24502 ts_tablet_manager.cc:1403] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48: Time spent bootstrapping tablet: real 0.001s	user 0.000s	sys 0.001s
I20251212 21:10:32.260380 24503 consensus_queue.cc:260] T d0f2580352f640e794cf95624f9b32d0 P f1d79b00d76349faa9c59b372f7877ba [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.255865 24282 tablet_service.cc:1505] Processing CreateTablet for tablet d0f2580352f640e794cf95624f9b32d0 (DEFAULT_TABLE table=test-workload [id=36b89bd33e2b4f068e2cb64fca3c13b6]), partition=RANGE (key) PARTITION 1431655764 <= VALUES < 1789569705
I20251212 21:10:32.260517 24502 raft_consensus.cc:359] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.260571 24502 raft_consensus.cc:385] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251212 21:10:32.260562 24282 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet d0f2580352f640e794cf95624f9b32d0. 1 dirs total, 0 dirs full, 0 dirs failed
I20251212 21:10:32.260586 24502 raft_consensus.cc:740] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 2547ecb5f0e74137add24df5b3fbac48, State: Initialized, Role: FOLLOWER
I20251212 21:10:32.260637 24502 consensus_queue.cc:260] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.260680 24503 ts_tablet_manager.cc:1434] T d0f2580352f640e794cf95624f9b32d0 P f1d79b00d76349faa9c59b372f7877ba: Time spent starting tablet: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:10:32.260717 24502 ts_tablet_manager.cc:1434] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:32.260735 24503 tablet_bootstrap.cc:492] T b47a8b191555448b9dd7ec692a253b73 P f1d79b00d76349faa9c59b372f7877ba: Bootstrap starting.
I20251212 21:10:32.260775 24502 tablet_bootstrap.cc:492] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48: Bootstrap starting.
I20251212 21:10:32.261132 24503 tablet_bootstrap.cc:654] T b47a8b191555448b9dd7ec692a253b73 P f1d79b00d76349faa9c59b372f7877ba: Neither blocks nor log segments found. Creating new log.
I20251212 21:10:32.261157 24502 tablet_bootstrap.cc:654] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48: Neither blocks nor log segments found. Creating new log.
I20251212 21:10:32.261704 24503 tablet_bootstrap.cc:492] T b47a8b191555448b9dd7ec692a253b73 P f1d79b00d76349faa9c59b372f7877ba: No bootstrap required, opened a new log
I20251212 21:10:32.261746 24503 ts_tablet_manager.cc:1403] T b47a8b191555448b9dd7ec692a253b73 P f1d79b00d76349faa9c59b372f7877ba: Time spent bootstrapping tablet: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:32.261883 24503 raft_consensus.cc:359] T b47a8b191555448b9dd7ec692a253b73 P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.261947 24503 raft_consensus.cc:385] T b47a8b191555448b9dd7ec692a253b73 P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251212 21:10:32.261948 24502 tablet_bootstrap.cc:492] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48: No bootstrap required, opened a new log
I20251212 21:10:32.261967 24503 raft_consensus.cc:740] T b47a8b191555448b9dd7ec692a253b73 P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: f1d79b00d76349faa9c59b372f7877ba, State: Initialized, Role: FOLLOWER
I20251212 21:10:32.261981 24502 ts_tablet_manager.cc:1403] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48: Time spent bootstrapping tablet: real 0.001s	user 0.000s	sys 0.001s
I20251212 21:10:32.262010 24503 consensus_queue.cc:260] T b47a8b191555448b9dd7ec692a253b73 P f1d79b00d76349faa9c59b372f7877ba [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.262097 24503 ts_tablet_manager.cc:1434] T b47a8b191555448b9dd7ec692a253b73 P f1d79b00d76349faa9c59b372f7877ba: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:32.262156 24503 tablet_bootstrap.cc:492] T 9831c0f3d91045398ce4c2183f7db85d P f1d79b00d76349faa9c59b372f7877ba: Bootstrap starting.
I20251212 21:10:32.262251 24502 raft_consensus.cc:359] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.262319 24502 raft_consensus.cc:385] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251212 21:10:32.255777 24286 tablet_service.cc:1505] Processing CreateTablet for tablet a3ecd74d94aa4867aab9e37f1675cc27 (DEFAULT_TABLE table=test-workload [id=36b89bd33e2b4f068e2cb64fca3c13b6]), partition=RANGE (key) PARTITION VALUES < 357913941
I20251212 21:10:32.262519 24286 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet a3ecd74d94aa4867aab9e37f1675cc27. 1 dirs total, 0 dirs full, 0 dirs failed
I20251212 21:10:32.262535 24503 tablet_bootstrap.cc:654] T 9831c0f3d91045398ce4c2183f7db85d P f1d79b00d76349faa9c59b372f7877ba: Neither blocks nor log segments found. Creating new log.
I20251212 21:10:32.262338 24502 raft_consensus.cc:740] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 2547ecb5f0e74137add24df5b3fbac48, State: Initialized, Role: FOLLOWER
I20251212 21:10:32.262692 24502 consensus_queue.cc:260] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.262825 24502 ts_tablet_manager.cc:1434] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48: Time spent starting tablet: real 0.001s	user 0.000s	sys 0.001s
I20251212 21:10:32.262905 24502 tablet_bootstrap.cc:492] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48: Bootstrap starting.
I20251212 21:10:32.263381 24503 tablet_bootstrap.cc:492] T 9831c0f3d91045398ce4c2183f7db85d P f1d79b00d76349faa9c59b372f7877ba: No bootstrap required, opened a new log
I20251212 21:10:32.263432 24503 ts_tablet_manager.cc:1403] T 9831c0f3d91045398ce4c2183f7db85d P f1d79b00d76349faa9c59b372f7877ba: Time spent bootstrapping tablet: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:32.263581 24503 raft_consensus.cc:359] T 9831c0f3d91045398ce4c2183f7db85d P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.263643 24503 raft_consensus.cc:385] T 9831c0f3d91045398ce4c2183f7db85d P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251212 21:10:32.263660 24503 raft_consensus.cc:740] T 9831c0f3d91045398ce4c2183f7db85d P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: f1d79b00d76349faa9c59b372f7877ba, State: Initialized, Role: FOLLOWER
I20251212 21:10:32.263696 24503 consensus_queue.cc:260] T 9831c0f3d91045398ce4c2183f7db85d P f1d79b00d76349faa9c59b372f7877ba [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.263706 24285 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 9831c0f3d91045398ce4c2183f7db85d. 1 dirs total, 0 dirs full, 0 dirs failed
I20251212 21:10:32.263783 24502 tablet_bootstrap.cc:654] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48: Neither blocks nor log segments found. Creating new log.
I20251212 21:10:32.263788 24503 ts_tablet_manager.cc:1434] T 9831c0f3d91045398ce4c2183f7db85d P f1d79b00d76349faa9c59b372f7877ba: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:32.263856 24503 tablet_bootstrap.cc:492] T a3ecd74d94aa4867aab9e37f1675cc27 P f1d79b00d76349faa9c59b372f7877ba: Bootstrap starting.
I20251212 21:10:32.264230 24503 tablet_bootstrap.cc:654] T a3ecd74d94aa4867aab9e37f1675cc27 P f1d79b00d76349faa9c59b372f7877ba: Neither blocks nor log segments found. Creating new log.
I20251212 21:10:32.264653 24281 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 0ea0cdb48a6640879d462b9e52dcbcbd. 1 dirs total, 0 dirs full, 0 dirs failed
I20251212 21:10:32.264767 24502 tablet_bootstrap.cc:492] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48: No bootstrap required, opened a new log
I20251212 21:10:32.264806 24502 ts_tablet_manager.cc:1403] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48: Time spent bootstrapping tablet: real 0.002s	user 0.000s	sys 0.001s
I20251212 21:10:32.264925 24502 raft_consensus.cc:359] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } }
I20251212 21:10:32.264963 24502 raft_consensus.cc:385] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251212 21:10:32.264974 24502 raft_consensus.cc:740] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 2547ecb5f0e74137add24df5b3fbac48, State: Initialized, Role: FOLLOWER
I20251212 21:10:32.265024 24502 consensus_queue.cc:260] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } }
I20251212 21:10:32.265103 24502 ts_tablet_manager.cc:1434] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:32.265173 24502 tablet_bootstrap.cc:492] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48: Bootstrap starting.
I20251212 21:10:32.265424 24503 tablet_bootstrap.cc:492] T a3ecd74d94aa4867aab9e37f1675cc27 P f1d79b00d76349faa9c59b372f7877ba: No bootstrap required, opened a new log
I20251212 21:10:32.265461 24503 ts_tablet_manager.cc:1403] T a3ecd74d94aa4867aab9e37f1675cc27 P f1d79b00d76349faa9c59b372f7877ba: Time spent bootstrapping tablet: real 0.002s	user 0.001s	sys 0.000s
I20251212 21:10:32.265583 24503 raft_consensus.cc:359] T a3ecd74d94aa4867aab9e37f1675cc27 P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } }
I20251212 21:10:32.265614 24502 tablet_bootstrap.cc:654] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48: Neither blocks nor log segments found. Creating new log.
I20251212 21:10:32.265633 24503 raft_consensus.cc:385] T a3ecd74d94aa4867aab9e37f1675cc27 P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251212 21:10:32.265654 24503 raft_consensus.cc:740] T a3ecd74d94aa4867aab9e37f1675cc27 P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: f1d79b00d76349faa9c59b372f7877ba, State: Initialized, Role: FOLLOWER
I20251212 21:10:32.265709 24503 consensus_queue.cc:260] T a3ecd74d94aa4867aab9e37f1675cc27 P f1d79b00d76349faa9c59b372f7877ba [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } }
I20251212 21:10:32.265790 24503 ts_tablet_manager.cc:1434] T a3ecd74d94aa4867aab9e37f1675cc27 P f1d79b00d76349faa9c59b372f7877ba: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:32.266446 24508 tablet_bootstrap.cc:492] T 074cc25aad644982ae7d025718ae01f5 P 0ecacd125c104184b71910c5597cef64: No bootstrap required, opened a new log
I20251212 21:10:32.266454 24502 tablet_bootstrap.cc:492] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48: No bootstrap required, opened a new log
I20251212 21:10:32.266491 24502 ts_tablet_manager.cc:1403] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48: Time spent bootstrapping tablet: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:32.266510 24508 ts_tablet_manager.cc:1403] T 074cc25aad644982ae7d025718ae01f5 P 0ecacd125c104184b71910c5597cef64: Time spent bootstrapping tablet: real 0.008s	user 0.001s	sys 0.000s
I20251212 21:10:32.266595 24502 raft_consensus.cc:359] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.266640 24502 raft_consensus.cc:385] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251212 21:10:32.266652 24502 raft_consensus.cc:740] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 2547ecb5f0e74137add24df5b3fbac48, State: Initialized, Role: FOLLOWER
I20251212 21:10:32.266690 24502 consensus_queue.cc:260] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.266784 24502 ts_tablet_manager.cc:1434] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:32.266839 24502 tablet_bootstrap.cc:492] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48: Bootstrap starting.
I20251212 21:10:32.267262 24502 tablet_bootstrap.cc:654] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48: Neither blocks nor log segments found. Creating new log.
I20251212 21:10:32.267962 24502 tablet_bootstrap.cc:492] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48: No bootstrap required, opened a new log
I20251212 21:10:32.268000 24502 ts_tablet_manager.cc:1403] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48: Time spent bootstrapping tablet: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:32.267922 24508 raft_consensus.cc:359] T 074cc25aad644982ae7d025718ae01f5 P 0ecacd125c104184b71910c5597cef64 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.268067 24508 raft_consensus.cc:385] T 074cc25aad644982ae7d025718ae01f5 P 0ecacd125c104184b71910c5597cef64 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251212 21:10:32.268096 24508 raft_consensus.cc:740] T 074cc25aad644982ae7d025718ae01f5 P 0ecacd125c104184b71910c5597cef64 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 0ecacd125c104184b71910c5597cef64, State: Initialized, Role: FOLLOWER
I20251212 21:10:32.268117 24502 raft_consensus.cc:359] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.268167 24502 raft_consensus.cc:385] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251212 21:10:32.268187 24502 raft_consensus.cc:740] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 2547ecb5f0e74137add24df5b3fbac48, State: Initialized, Role: FOLLOWER
I20251212 21:10:32.268182 24508 consensus_queue.cc:260] T 074cc25aad644982ae7d025718ae01f5 P 0ecacd125c104184b71910c5597cef64 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.268261 24502 consensus_queue.cc:260] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.268381 24502 ts_tablet_manager.cc:1434] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:32.268445 24508 ts_tablet_manager.cc:1434] T 074cc25aad644982ae7d025718ae01f5 P 0ecacd125c104184b71910c5597cef64: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
I20251212 21:10:32.268553 24508 tablet_bootstrap.cc:492] T b47a8b191555448b9dd7ec692a253b73 P 0ecacd125c104184b71910c5597cef64: Bootstrap starting.
I20251212 21:10:32.268970 24508 tablet_bootstrap.cc:654] T b47a8b191555448b9dd7ec692a253b73 P 0ecacd125c104184b71910c5597cef64: Neither blocks nor log segments found. Creating new log.
I20251212 21:10:32.269332 24352 heartbeater.cc:499] Master 127.23.110.190:46669 was elected leader, sending a full tablet report...
I20251212 21:10:32.269358 24506 raft_consensus.cc:493] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251212 21:10:32.269431 24506 raft_consensus.cc:515] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } }
I20251212 21:10:32.269552 24508 tablet_bootstrap.cc:492] T b47a8b191555448b9dd7ec692a253b73 P 0ecacd125c104184b71910c5597cef64: No bootstrap required, opened a new log
I20251212 21:10:32.269587 24508 ts_tablet_manager.cc:1403] T b47a8b191555448b9dd7ec692a253b73 P 0ecacd125c104184b71910c5597cef64: Time spent bootstrapping tablet: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:32.269742 24506 leader_election.cc:290] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523), 0ecacd125c104184b71910c5597cef64 (127.23.110.130:35585)
I20251212 21:10:32.269724 24508 raft_consensus.cc:359] T b47a8b191555448b9dd7ec692a253b73 P 0ecacd125c104184b71910c5597cef64 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.269781 24508 raft_consensus.cc:385] T b47a8b191555448b9dd7ec692a253b73 P 0ecacd125c104184b71910c5597cef64 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251212 21:10:32.269811 24508 raft_consensus.cc:740] T b47a8b191555448b9dd7ec692a253b73 P 0ecacd125c104184b71910c5597cef64 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 0ecacd125c104184b71910c5597cef64, State: Initialized, Role: FOLLOWER
I20251212 21:10:32.269883 24508 consensus_queue.cc:260] T b47a8b191555448b9dd7ec692a253b73 P 0ecacd125c104184b71910c5597cef64 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.269976 24508 ts_tablet_manager.cc:1434] T b47a8b191555448b9dd7ec692a253b73 P 0ecacd125c104184b71910c5597cef64: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:32.270079 24508 tablet_bootstrap.cc:492] T d0f2580352f640e794cf95624f9b32d0 P 0ecacd125c104184b71910c5597cef64: Bootstrap starting.
I20251212 21:10:32.270506 24508 tablet_bootstrap.cc:654] T d0f2580352f640e794cf95624f9b32d0 P 0ecacd125c104184b71910c5597cef64: Neither blocks nor log segments found. Creating new log.
I20251212 21:10:32.271337 24508 tablet_bootstrap.cc:492] T d0f2580352f640e794cf95624f9b32d0 P 0ecacd125c104184b71910c5597cef64: No bootstrap required, opened a new log
I20251212 21:10:32.271390 24508 ts_tablet_manager.cc:1403] T d0f2580352f640e794cf95624f9b32d0 P 0ecacd125c104184b71910c5597cef64: Time spent bootstrapping tablet: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:32.271536 24508 raft_consensus.cc:359] T d0f2580352f640e794cf95624f9b32d0 P 0ecacd125c104184b71910c5597cef64 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.271608 24508 raft_consensus.cc:385] T d0f2580352f640e794cf95624f9b32d0 P 0ecacd125c104184b71910c5597cef64 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251212 21:10:32.271627 24508 raft_consensus.cc:740] T d0f2580352f640e794cf95624f9b32d0 P 0ecacd125c104184b71910c5597cef64 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 0ecacd125c104184b71910c5597cef64, State: Initialized, Role: FOLLOWER
I20251212 21:10:32.271667 24508 consensus_queue.cc:260] T d0f2580352f640e794cf95624f9b32d0 P 0ecacd125c104184b71910c5597cef64 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.271776 24508 ts_tablet_manager.cc:1434] T d0f2580352f640e794cf95624f9b32d0 P 0ecacd125c104184b71910c5597cef64: Time spent starting tablet: real 0.000s	user 0.001s	sys 0.000s
I20251212 21:10:32.271847 24508 tablet_bootstrap.cc:492] T a3ecd74d94aa4867aab9e37f1675cc27 P 0ecacd125c104184b71910c5597cef64: Bootstrap starting.
I20251212 21:10:32.272279 24508 tablet_bootstrap.cc:654] T a3ecd74d94aa4867aab9e37f1675cc27 P 0ecacd125c104184b71910c5597cef64: Neither blocks nor log segments found. Creating new log.
I20251212 21:10:32.273025 24508 tablet_bootstrap.cc:492] T a3ecd74d94aa4867aab9e37f1675cc27 P 0ecacd125c104184b71910c5597cef64: No bootstrap required, opened a new log
I20251212 21:10:32.273069 24508 ts_tablet_manager.cc:1403] T a3ecd74d94aa4867aab9e37f1675cc27 P 0ecacd125c104184b71910c5597cef64: Time spent bootstrapping tablet: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:32.273209 24508 raft_consensus.cc:359] T a3ecd74d94aa4867aab9e37f1675cc27 P 0ecacd125c104184b71910c5597cef64 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } }
I20251212 21:10:32.273294 24508 raft_consensus.cc:385] T a3ecd74d94aa4867aab9e37f1675cc27 P 0ecacd125c104184b71910c5597cef64 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251212 21:10:32.273321 24508 raft_consensus.cc:740] T a3ecd74d94aa4867aab9e37f1675cc27 P 0ecacd125c104184b71910c5597cef64 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 0ecacd125c104184b71910c5597cef64, State: Initialized, Role: FOLLOWER
I20251212 21:10:32.273368 24508 consensus_queue.cc:260] T a3ecd74d94aa4867aab9e37f1675cc27 P 0ecacd125c104184b71910c5597cef64 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } }
I20251212 21:10:32.273448 24508 ts_tablet_manager.cc:1434] T a3ecd74d94aa4867aab9e37f1675cc27 P 0ecacd125c104184b71910c5597cef64: Time spent starting tablet: real 0.000s	user 0.001s	sys 0.000s
I20251212 21:10:32.273506 24508 tablet_bootstrap.cc:492] T 9831c0f3d91045398ce4c2183f7db85d P 0ecacd125c104184b71910c5597cef64: Bootstrap starting.
I20251212 21:10:32.273937 24508 tablet_bootstrap.cc:654] T 9831c0f3d91045398ce4c2183f7db85d P 0ecacd125c104184b71910c5597cef64: Neither blocks nor log segments found. Creating new log.
I20251212 21:10:32.274102 24173 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "a3ecd74d94aa4867aab9e37f1675cc27" candidate_uuid: "2547ecb5f0e74137add24df5b3fbac48" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f1d79b00d76349faa9c59b372f7877ba" is_pre_election: true
I20251212 21:10:32.274140 24306 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "a3ecd74d94aa4867aab9e37f1675cc27" candidate_uuid: "2547ecb5f0e74137add24df5b3fbac48" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "0ecacd125c104184b71910c5597cef64" is_pre_election: true
I20251212 21:10:32.274267 24306 raft_consensus.cc:2468] T a3ecd74d94aa4867aab9e37f1675cc27 P 0ecacd125c104184b71910c5597cef64 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 2547ecb5f0e74137add24df5b3fbac48 in term 0.
I20251212 21:10:32.274267 24173 raft_consensus.cc:2468] T a3ecd74d94aa4867aab9e37f1675cc27 P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 2547ecb5f0e74137add24df5b3fbac48 in term 0.
I20251212 21:10:32.274483 24373 leader_election.cc:304] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 2547ecb5f0e74137add24df5b3fbac48, f1d79b00d76349faa9c59b372f7877ba; no voters: 
I20251212 21:10:32.274670 24506 raft_consensus.cc:2804] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20251212 21:10:32.274706 24508 tablet_bootstrap.cc:492] T 9831c0f3d91045398ce4c2183f7db85d P 0ecacd125c104184b71910c5597cef64: No bootstrap required, opened a new log
I20251212 21:10:32.274732 24506 raft_consensus.cc:493] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251212 21:10:32.274761 24506 raft_consensus.cc:3060] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Advancing to term 1
I20251212 21:10:32.274775 24508 ts_tablet_manager.cc:1403] T 9831c0f3d91045398ce4c2183f7db85d P 0ecacd125c104184b71910c5597cef64: Time spent bootstrapping tablet: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:32.274914 24508 raft_consensus.cc:359] T 9831c0f3d91045398ce4c2183f7db85d P 0ecacd125c104184b71910c5597cef64 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.274951 24508 raft_consensus.cc:385] T 9831c0f3d91045398ce4c2183f7db85d P 0ecacd125c104184b71910c5597cef64 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251212 21:10:32.274962 24508 raft_consensus.cc:740] T 9831c0f3d91045398ce4c2183f7db85d P 0ecacd125c104184b71910c5597cef64 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 0ecacd125c104184b71910c5597cef64, State: Initialized, Role: FOLLOWER
I20251212 21:10:32.274994 24508 consensus_queue.cc:260] T 9831c0f3d91045398ce4c2183f7db85d P 0ecacd125c104184b71910c5597cef64 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.275075 24508 ts_tablet_manager.cc:1434] T 9831c0f3d91045398ce4c2183f7db85d P 0ecacd125c104184b71910c5597cef64: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:32.275120 24508 tablet_bootstrap.cc:492] T 0ea0cdb48a6640879d462b9e52dcbcbd P 0ecacd125c104184b71910c5597cef64: Bootstrap starting.
I20251212 21:10:32.275625 24508 tablet_bootstrap.cc:654] T 0ea0cdb48a6640879d462b9e52dcbcbd P 0ecacd125c104184b71910c5597cef64: Neither blocks nor log segments found. Creating new log.
I20251212 21:10:32.275646 24506 raft_consensus.cc:515] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } }
I20251212 21:10:32.275794 24506 leader_election.cc:290] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 [CANDIDATE]: Term 1 election: Requested vote from peers f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523), 0ecacd125c104184b71910c5597cef64 (127.23.110.130:35585)
I20251212 21:10:32.275962 24173 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "a3ecd74d94aa4867aab9e37f1675cc27" candidate_uuid: "2547ecb5f0e74137add24df5b3fbac48" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f1d79b00d76349faa9c59b372f7877ba"
I20251212 21:10:32.276024 24306 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "a3ecd74d94aa4867aab9e37f1675cc27" candidate_uuid: "2547ecb5f0e74137add24df5b3fbac48" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "0ecacd125c104184b71910c5597cef64"
I20251212 21:10:32.276041 24173 raft_consensus.cc:3060] T a3ecd74d94aa4867aab9e37f1675cc27 P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Advancing to term 1
I20251212 21:10:32.276083 24306 raft_consensus.cc:3060] T a3ecd74d94aa4867aab9e37f1675cc27 P 0ecacd125c104184b71910c5597cef64 [term 0 FOLLOWER]: Advancing to term 1
I20251212 21:10:32.276170 24508 tablet_bootstrap.cc:492] T 0ea0cdb48a6640879d462b9e52dcbcbd P 0ecacd125c104184b71910c5597cef64: No bootstrap required, opened a new log
I20251212 21:10:32.276201 24508 ts_tablet_manager.cc:1403] T 0ea0cdb48a6640879d462b9e52dcbcbd P 0ecacd125c104184b71910c5597cef64: Time spent bootstrapping tablet: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:32.276319 24508 raft_consensus.cc:359] T 0ea0cdb48a6640879d462b9e52dcbcbd P 0ecacd125c104184b71910c5597cef64 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.276362 24508 raft_consensus.cc:385] T 0ea0cdb48a6640879d462b9e52dcbcbd P 0ecacd125c104184b71910c5597cef64 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251212 21:10:32.276378 24508 raft_consensus.cc:740] T 0ea0cdb48a6640879d462b9e52dcbcbd P 0ecacd125c104184b71910c5597cef64 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 0ecacd125c104184b71910c5597cef64, State: Initialized, Role: FOLLOWER
I20251212 21:10:32.276422 24508 consensus_queue.cc:260] T 0ea0cdb48a6640879d462b9e52dcbcbd P 0ecacd125c104184b71910c5597cef64 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.276516 24508 ts_tablet_manager.cc:1434] T 0ea0cdb48a6640879d462b9e52dcbcbd P 0ecacd125c104184b71910c5597cef64: Time spent starting tablet: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:32.276806 24173 raft_consensus.cc:2468] T a3ecd74d94aa4867aab9e37f1675cc27 P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 2547ecb5f0e74137add24df5b3fbac48 in term 1.
I20251212 21:10:32.276907 24306 raft_consensus.cc:2468] T a3ecd74d94aa4867aab9e37f1675cc27 P 0ecacd125c104184b71910c5597cef64 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 2547ecb5f0e74137add24df5b3fbac48 in term 1.
I20251212 21:10:32.276980 24373 leader_election.cc:304] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 2547ecb5f0e74137add24df5b3fbac48, f1d79b00d76349faa9c59b372f7877ba; no voters: 
I20251212 21:10:32.277093 24506 raft_consensus.cc:2804] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 [term 1 FOLLOWER]: Leader election won for term 1
I20251212 21:10:32.277300 24506 raft_consensus.cc:697] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 [term 1 LEADER]: Becoming Leader. State: Replica: 2547ecb5f0e74137add24df5b3fbac48, State: Running, Role: LEADER
I20251212 21:10:32.277397 24506 consensus_queue.cc:237] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } }
I20251212 21:10:32.278007 24513 raft_consensus.cc:493] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251212 21:10:32.278064 24513 raft_consensus.cc:515] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.278122 24032 catalog_manager.cc:5654] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 reported cstate change: term changed from 0 to 1, leader changed from <none> to 2547ecb5f0e74137add24df5b3fbac48 (127.23.110.131). New cstate: current_term: 1 leader_uuid: "2547ecb5f0e74137add24df5b3fbac48" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } health_report { overall_health: UNKNOWN } } }
I20251212 21:10:32.278318 24513 leader_election.cc:290] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 0ecacd125c104184b71910c5597cef64 (127.23.110.130:35585), 2547ecb5f0e74137add24df5b3fbac48 (127.23.110.131:35821)
I20251212 21:10:32.281451 24306 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "074cc25aad644982ae7d025718ae01f5" candidate_uuid: "f1d79b00d76349faa9c59b372f7877ba" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "0ecacd125c104184b71910c5597cef64" is_pre_election: true
I20251212 21:10:32.281544 24306 raft_consensus.cc:2468] T 074cc25aad644982ae7d025718ae01f5 P 0ecacd125c104184b71910c5597cef64 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate f1d79b00d76349faa9c59b372f7877ba in term 0.
I20251212 21:10:32.281661 24437 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "074cc25aad644982ae7d025718ae01f5" candidate_uuid: "f1d79b00d76349faa9c59b372f7877ba" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "2547ecb5f0e74137add24df5b3fbac48" is_pre_election: true
I20251212 21:10:32.281709 24110 leader_election.cc:304] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 0ecacd125c104184b71910c5597cef64, f1d79b00d76349faa9c59b372f7877ba; no voters: 
I20251212 21:10:32.281778 24437 raft_consensus.cc:2468] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate f1d79b00d76349faa9c59b372f7877ba in term 0.
I20251212 21:10:32.281893 24513 raft_consensus.cc:2804] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Leader pre-election won for term 1
I20251212 21:10:32.281955 24513 raft_consensus.cc:493] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251212 21:10:32.281988 24513 raft_consensus.cc:3060] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Advancing to term 1
I20251212 21:10:32.282539 24513 raft_consensus.cc:515] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.282680 24513 leader_election.cc:290] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba [CANDIDATE]: Term 1 election: Requested vote from peers 0ecacd125c104184b71910c5597cef64 (127.23.110.130:35585), 2547ecb5f0e74137add24df5b3fbac48 (127.23.110.131:35821)
I20251212 21:10:32.282909 24437 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "074cc25aad644982ae7d025718ae01f5" candidate_uuid: "f1d79b00d76349faa9c59b372f7877ba" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "2547ecb5f0e74137add24df5b3fbac48"
I20251212 21:10:32.282907 24306 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "074cc25aad644982ae7d025718ae01f5" candidate_uuid: "f1d79b00d76349faa9c59b372f7877ba" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "0ecacd125c104184b71910c5597cef64"
I20251212 21:10:32.282990 24306 raft_consensus.cc:3060] T 074cc25aad644982ae7d025718ae01f5 P 0ecacd125c104184b71910c5597cef64 [term 0 FOLLOWER]: Advancing to term 1
I20251212 21:10:32.282990 24437 raft_consensus.cc:3060] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Advancing to term 1
I20251212 21:10:32.283708 24306 raft_consensus.cc:2468] T 074cc25aad644982ae7d025718ae01f5 P 0ecacd125c104184b71910c5597cef64 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate f1d79b00d76349faa9c59b372f7877ba in term 1.
I20251212 21:10:32.283727 24437 raft_consensus.cc:2468] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate f1d79b00d76349faa9c59b372f7877ba in term 1.
I20251212 21:10:32.283905 24111 leader_election.cc:304] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 2547ecb5f0e74137add24df5b3fbac48, f1d79b00d76349faa9c59b372f7877ba; no voters: 
I20251212 21:10:32.284016 24513 raft_consensus.cc:2804] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Leader election won for term 1
I20251212 21:10:32.284070 24513 raft_consensus.cc:697] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba [term 1 LEADER]: Becoming Leader. State: Replica: f1d79b00d76349faa9c59b372f7877ba, State: Running, Role: LEADER
I20251212 21:10:32.284199 24513 consensus_queue.cc:237] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.285090 24032 catalog_manager.cc:5654] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba reported cstate change: term changed from 0 to 1, leader changed from <none> to f1d79b00d76349faa9c59b372f7877ba (127.23.110.129). New cstate: current_term: 1 leader_uuid: "f1d79b00d76349faa9c59b372f7877ba" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } health_report { overall_health: UNKNOWN } } }
I20251212 21:10:32.289906 24506 raft_consensus.cc:493] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251212 21:10:32.289991 24506 raft_consensus.cc:515] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.290155 24506 leader_election.cc:290] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523), 0ecacd125c104184b71910c5597cef64 (127.23.110.130:35585)
I20251212 21:10:32.290357 24173 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "b47a8b191555448b9dd7ec692a253b73" candidate_uuid: "2547ecb5f0e74137add24df5b3fbac48" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f1d79b00d76349faa9c59b372f7877ba" is_pre_election: true
I20251212 21:10:32.290385 24306 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "b47a8b191555448b9dd7ec692a253b73" candidate_uuid: "2547ecb5f0e74137add24df5b3fbac48" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "0ecacd125c104184b71910c5597cef64" is_pre_election: true
I20251212 21:10:32.290501 24173 raft_consensus.cc:2468] T b47a8b191555448b9dd7ec692a253b73 P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 2547ecb5f0e74137add24df5b3fbac48 in term 0.
I20251212 21:10:32.290500 24306 raft_consensus.cc:2468] T b47a8b191555448b9dd7ec692a253b73 P 0ecacd125c104184b71910c5597cef64 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 2547ecb5f0e74137add24df5b3fbac48 in term 0.
I20251212 21:10:32.290654 24373 leader_election.cc:304] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 2547ecb5f0e74137add24df5b3fbac48, f1d79b00d76349faa9c59b372f7877ba; no voters: 
I20251212 21:10:32.290746 24506 raft_consensus.cc:2804] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20251212 21:10:32.290776 24506 raft_consensus.cc:493] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251212 21:10:32.290792 24506 raft_consensus.cc:3060] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Advancing to term 1
I20251212 21:10:32.291333 24506 raft_consensus.cc:515] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.291500 24506 leader_election.cc:290] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 [CANDIDATE]: Term 1 election: Requested vote from peers f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523), 0ecacd125c104184b71910c5597cef64 (127.23.110.130:35585)
I20251212 21:10:32.291625 24173 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "b47a8b191555448b9dd7ec692a253b73" candidate_uuid: "2547ecb5f0e74137add24df5b3fbac48" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f1d79b00d76349faa9c59b372f7877ba"
I20251212 21:10:32.291703 24173 raft_consensus.cc:3060] T b47a8b191555448b9dd7ec692a253b73 P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Advancing to term 1
I20251212 21:10:32.291716 24306 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "b47a8b191555448b9dd7ec692a253b73" candidate_uuid: "2547ecb5f0e74137add24df5b3fbac48" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "0ecacd125c104184b71910c5597cef64"
I20251212 21:10:32.291783 24306 raft_consensus.cc:3060] T b47a8b191555448b9dd7ec692a253b73 P 0ecacd125c104184b71910c5597cef64 [term 0 FOLLOWER]: Advancing to term 1
I20251212 21:10:32.292362 24173 raft_consensus.cc:2468] T b47a8b191555448b9dd7ec692a253b73 P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 2547ecb5f0e74137add24df5b3fbac48 in term 1.
I20251212 21:10:32.292362 24306 raft_consensus.cc:2468] T b47a8b191555448b9dd7ec692a253b73 P 0ecacd125c104184b71910c5597cef64 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 2547ecb5f0e74137add24df5b3fbac48 in term 1.
I20251212 21:10:32.292555 24372 leader_election.cc:304] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 0ecacd125c104184b71910c5597cef64, 2547ecb5f0e74137add24df5b3fbac48; no voters: 
I20251212 21:10:32.292649 24506 raft_consensus.cc:2804] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 [term 1 FOLLOWER]: Leader election won for term 1
I20251212 21:10:32.292727 24506 raft_consensus.cc:697] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 [term 1 LEADER]: Becoming Leader. State: Replica: 2547ecb5f0e74137add24df5b3fbac48, State: Running, Role: LEADER
I20251212 21:10:32.292779 24506 consensus_queue.cc:237] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.293354 24032 catalog_manager.cc:5654] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 reported cstate change: term changed from 0 to 1, leader changed from <none> to 2547ecb5f0e74137add24df5b3fbac48 (127.23.110.131). New cstate: current_term: 1 leader_uuid: "2547ecb5f0e74137add24df5b3fbac48" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } health_report { overall_health: HEALTHY } } }
I20251212 21:10:32.304874 24506 raft_consensus.cc:493] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251212 21:10:32.304929 24528 raft_consensus.cc:493] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251212 21:10:32.304972 24506 raft_consensus.cc:515] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.305006 24528 raft_consensus.cc:515] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.305099 24506 leader_election.cc:290] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523), 0ecacd125c104184b71910c5597cef64 (127.23.110.130:35585)
I20251212 21:10:32.305125 24528 leader_election.cc:290] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523), 0ecacd125c104184b71910c5597cef64 (127.23.110.130:35585)
I20251212 21:10:32.305337 24173 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "9831c0f3d91045398ce4c2183f7db85d" candidate_uuid: "2547ecb5f0e74137add24df5b3fbac48" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f1d79b00d76349faa9c59b372f7877ba" is_pre_election: true
I20251212 21:10:32.305361 24306 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "9831c0f3d91045398ce4c2183f7db85d" candidate_uuid: "2547ecb5f0e74137add24df5b3fbac48" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "0ecacd125c104184b71910c5597cef64" is_pre_election: true
I20251212 21:10:32.305397 24305 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "0ea0cdb48a6640879d462b9e52dcbcbd" candidate_uuid: "2547ecb5f0e74137add24df5b3fbac48" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "0ecacd125c104184b71910c5597cef64" is_pre_election: true
I20251212 21:10:32.305419 24175 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "0ea0cdb48a6640879d462b9e52dcbcbd" candidate_uuid: "2547ecb5f0e74137add24df5b3fbac48" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f1d79b00d76349faa9c59b372f7877ba" is_pre_election: true
I20251212 21:10:32.305454 24306 raft_consensus.cc:2468] T 9831c0f3d91045398ce4c2183f7db85d P 0ecacd125c104184b71910c5597cef64 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 2547ecb5f0e74137add24df5b3fbac48 in term 0.
I20251212 21:10:32.305471 24173 raft_consensus.cc:2468] T 9831c0f3d91045398ce4c2183f7db85d P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 2547ecb5f0e74137add24df5b3fbac48 in term 0.
I20251212 21:10:32.305461 24305 raft_consensus.cc:2468] T 0ea0cdb48a6640879d462b9e52dcbcbd P 0ecacd125c104184b71910c5597cef64 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 2547ecb5f0e74137add24df5b3fbac48 in term 0.
I20251212 21:10:32.305481 24175 raft_consensus.cc:2468] T 0ea0cdb48a6640879d462b9e52dcbcbd P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 2547ecb5f0e74137add24df5b3fbac48 in term 0.
I20251212 21:10:32.305637 24373 leader_election.cc:304] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 2547ecb5f0e74137add24df5b3fbac48, f1d79b00d76349faa9c59b372f7877ba; no voters: 
I20251212 21:10:32.305727 24528 raft_consensus.cc:2804] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20251212 21:10:32.305718 24373 leader_election.cc:304] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 2547ecb5f0e74137add24df5b3fbac48, f1d79b00d76349faa9c59b372f7877ba; no voters: 
I20251212 21:10:32.305790 24528 raft_consensus.cc:493] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251212 21:10:32.305837 24528 raft_consensus.cc:3060] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Advancing to term 1
I20251212 21:10:32.305845 24506 raft_consensus.cc:2804] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20251212 21:10:32.305888 24506 raft_consensus.cc:493] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251212 21:10:32.305912 24506 raft_consensus.cc:3060] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Advancing to term 1
I20251212 21:10:32.306567 24528 raft_consensus.cc:515] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.306627 24506 raft_consensus.cc:515] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.306702 24506 leader_election.cc:290] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 [CANDIDATE]: Term 1 election: Requested vote from peers f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523), 0ecacd125c104184b71910c5597cef64 (127.23.110.130:35585)
I20251212 21:10:32.306886 24528 leader_election.cc:290] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 [CANDIDATE]: Term 1 election: Requested vote from peers f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523), 0ecacd125c104184b71910c5597cef64 (127.23.110.130:35585)
I20251212 21:10:32.306872 24175 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "9831c0f3d91045398ce4c2183f7db85d" candidate_uuid: "2547ecb5f0e74137add24df5b3fbac48" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f1d79b00d76349faa9c59b372f7877ba"
I20251212 21:10:32.306910 24306 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "0ea0cdb48a6640879d462b9e52dcbcbd" candidate_uuid: "2547ecb5f0e74137add24df5b3fbac48" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "0ecacd125c104184b71910c5597cef64"
I20251212 21:10:32.306967 24306 raft_consensus.cc:3060] T 0ea0cdb48a6640879d462b9e52dcbcbd P 0ecacd125c104184b71910c5597cef64 [term 0 FOLLOWER]: Advancing to term 1
I20251212 21:10:32.306878 24173 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "0ea0cdb48a6640879d462b9e52dcbcbd" candidate_uuid: "2547ecb5f0e74137add24df5b3fbac48" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f1d79b00d76349faa9c59b372f7877ba"
I20251212 21:10:32.306949 24175 raft_consensus.cc:3060] T 9831c0f3d91045398ce4c2183f7db85d P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Advancing to term 1
I20251212 21:10:32.307011 24173 raft_consensus.cc:3060] T 0ea0cdb48a6640879d462b9e52dcbcbd P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Advancing to term 1
I20251212 21:10:32.307009 24305 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "9831c0f3d91045398ce4c2183f7db85d" candidate_uuid: "2547ecb5f0e74137add24df5b3fbac48" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "0ecacd125c104184b71910c5597cef64"
I20251212 21:10:32.307060 24305 raft_consensus.cc:3060] T 9831c0f3d91045398ce4c2183f7db85d P 0ecacd125c104184b71910c5597cef64 [term 0 FOLLOWER]: Advancing to term 1
I20251212 21:10:32.307614 24173 raft_consensus.cc:2468] T 0ea0cdb48a6640879d462b9e52dcbcbd P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 2547ecb5f0e74137add24df5b3fbac48 in term 1.
I20251212 21:10:32.307614 24306 raft_consensus.cc:2468] T 0ea0cdb48a6640879d462b9e52dcbcbd P 0ecacd125c104184b71910c5597cef64 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 2547ecb5f0e74137add24df5b3fbac48 in term 1.
I20251212 21:10:32.307754 24175 raft_consensus.cc:2468] T 9831c0f3d91045398ce4c2183f7db85d P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 2547ecb5f0e74137add24df5b3fbac48 in term 1.
I20251212 21:10:32.307754 24305 raft_consensus.cc:2468] T 9831c0f3d91045398ce4c2183f7db85d P 0ecacd125c104184b71910c5597cef64 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 2547ecb5f0e74137add24df5b3fbac48 in term 1.
I20251212 21:10:32.307766 24373 leader_election.cc:304] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 2547ecb5f0e74137add24df5b3fbac48, f1d79b00d76349faa9c59b372f7877ba; no voters: 
I20251212 21:10:32.307864 24528 raft_consensus.cc:2804] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 [term 1 FOLLOWER]: Leader election won for term 1
I20251212 21:10:32.307904 24372 leader_election.cc:304] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 0ecacd125c104184b71910c5597cef64, 2547ecb5f0e74137add24df5b3fbac48; no voters: 
I20251212 21:10:32.307916 24528 raft_consensus.cc:697] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 [term 1 LEADER]: Becoming Leader. State: Replica: 2547ecb5f0e74137add24df5b3fbac48, State: Running, Role: LEADER
I20251212 21:10:32.307961 24528 consensus_queue.cc:237] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.308017 24506 raft_consensus.cc:2804] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 [term 1 FOLLOWER]: Leader election won for term 1
I20251212 21:10:32.308071 24506 raft_consensus.cc:697] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 [term 1 LEADER]: Becoming Leader. State: Replica: 2547ecb5f0e74137add24df5b3fbac48, State: Running, Role: LEADER
I20251212 21:10:32.308110 24506 consensus_queue.cc:237] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.308528 24032 catalog_manager.cc:5654] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 reported cstate change: term changed from 0 to 1, leader changed from <none> to 2547ecb5f0e74137add24df5b3fbac48 (127.23.110.131). New cstate: current_term: 1 leader_uuid: "2547ecb5f0e74137add24df5b3fbac48" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } health_report { overall_health: HEALTHY } } }
I20251212 21:10:32.309589 24032 catalog_manager.cc:5654] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 reported cstate change: term changed from 0 to 1, leader changed from <none> to 2547ecb5f0e74137add24df5b3fbac48 (127.23.110.131). New cstate: current_term: 1 leader_uuid: "2547ecb5f0e74137add24df5b3fbac48" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } health_report { overall_health: HEALTHY } } }
I20251212 21:10:32.324445 24506 raft_consensus.cc:493] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251212 21:10:32.324600 24506 raft_consensus.cc:515] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.324769 24506 leader_election.cc:290] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523), 0ecacd125c104184b71910c5597cef64 (127.23.110.130:35585)
I20251212 21:10:32.325004 24305 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "d0f2580352f640e794cf95624f9b32d0" candidate_uuid: "2547ecb5f0e74137add24df5b3fbac48" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "0ecacd125c104184b71910c5597cef64" is_pre_election: true
I20251212 21:10:32.325022 24175 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "d0f2580352f640e794cf95624f9b32d0" candidate_uuid: "2547ecb5f0e74137add24df5b3fbac48" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f1d79b00d76349faa9c59b372f7877ba" is_pre_election: true
I20251212 21:10:32.325085 24305 raft_consensus.cc:2468] T d0f2580352f640e794cf95624f9b32d0 P 0ecacd125c104184b71910c5597cef64 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 2547ecb5f0e74137add24df5b3fbac48 in term 0.
I20251212 21:10:32.325119 24175 raft_consensus.cc:2468] T d0f2580352f640e794cf95624f9b32d0 P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 2547ecb5f0e74137add24df5b3fbac48 in term 0.
I20251212 21:10:32.325270 24372 leader_election.cc:304] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 0ecacd125c104184b71910c5597cef64, 2547ecb5f0e74137add24df5b3fbac48; no voters: 
I20251212 21:10:32.325388 24506 raft_consensus.cc:2804] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20251212 21:10:32.325438 24506 raft_consensus.cc:493] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251212 21:10:32.325451 24506 raft_consensus.cc:3060] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 [term 0 FOLLOWER]: Advancing to term 1
I20251212 21:10:32.326030 24506 raft_consensus.cc:515] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.326153 24506 leader_election.cc:290] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 [CANDIDATE]: Term 1 election: Requested vote from peers f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523), 0ecacd125c104184b71910c5597cef64 (127.23.110.130:35585)
I20251212 21:10:32.326349 24305 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "d0f2580352f640e794cf95624f9b32d0" candidate_uuid: "2547ecb5f0e74137add24df5b3fbac48" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "0ecacd125c104184b71910c5597cef64"
I20251212 21:10:32.326346 24175 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "d0f2580352f640e794cf95624f9b32d0" candidate_uuid: "2547ecb5f0e74137add24df5b3fbac48" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f1d79b00d76349faa9c59b372f7877ba"
I20251212 21:10:32.326418 24305 raft_consensus.cc:3060] T d0f2580352f640e794cf95624f9b32d0 P 0ecacd125c104184b71910c5597cef64 [term 0 FOLLOWER]: Advancing to term 1
I20251212 21:10:32.326418 24175 raft_consensus.cc:3060] T d0f2580352f640e794cf95624f9b32d0 P f1d79b00d76349faa9c59b372f7877ba [term 0 FOLLOWER]: Advancing to term 1
I20251212 21:10:32.326992 24305 raft_consensus.cc:2468] T d0f2580352f640e794cf95624f9b32d0 P 0ecacd125c104184b71910c5597cef64 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 2547ecb5f0e74137add24df5b3fbac48 in term 1.
I20251212 21:10:32.326992 24175 raft_consensus.cc:2468] T d0f2580352f640e794cf95624f9b32d0 P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 2547ecb5f0e74137add24df5b3fbac48 in term 1.
I20251212 21:10:32.327190 24372 leader_election.cc:304] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 0ecacd125c104184b71910c5597cef64, 2547ecb5f0e74137add24df5b3fbac48; no voters: 
I20251212 21:10:32.327293 24506 raft_consensus.cc:2804] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 [term 1 FOLLOWER]: Leader election won for term 1
I20251212 21:10:32.327343 24506 raft_consensus.cc:697] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 [term 1 LEADER]: Becoming Leader. State: Replica: 2547ecb5f0e74137add24df5b3fbac48, State: Running, Role: LEADER
I20251212 21:10:32.327392 24506 consensus_queue.cc:237] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.327994 24032 catalog_manager.cc:5654] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 reported cstate change: term changed from 0 to 1, leader changed from <none> to 2547ecb5f0e74137add24df5b3fbac48 (127.23.110.131). New cstate: current_term: 1 leader_uuid: "2547ecb5f0e74137add24df5b3fbac48" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } health_report { overall_health: HEALTHY } } }
I20251212 21:10:32.338611 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.132:0
--local_ip_for_outbound_sockets=127.23.110.132
--webserver_interface=127.23.110.132
--webserver_port=0
--tserver_master_addrs=127.23.110.190:46669
--builtin_ntp_servers=127.23.110.148:34329
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
W20251212 21:10:32.341102 24353 tablet.cc:2378] T 074cc25aad644982ae7d025718ae01f5 P 0ecacd125c104184b71910c5597cef64: Can't schedule compaction. Clean time has not been advanced past its initial value.
W20251212 21:10:32.341269 24353 tablet_replica_mm_ops.cc:318] Log GC is disabled (check --enable_log_gc)
I20251212 21:10:32.350194 24437 raft_consensus.cc:1275] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [term 1 FOLLOWER]: Refusing update from remote peer f1d79b00d76349faa9c59b372f7877ba: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251212 21:10:32.350512 24513 consensus_queue.cc:1048] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba [LEADER]: Connected to new peer: Peer: permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251212 21:10:32.350853 24305 raft_consensus.cc:1275] T 074cc25aad644982ae7d025718ae01f5 P 0ecacd125c104184b71910c5597cef64 [term 1 FOLLOWER]: Refusing update from remote peer f1d79b00d76349faa9c59b372f7877ba: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251212 21:10:32.351028 24306 raft_consensus.cc:1275] T a3ecd74d94aa4867aab9e37f1675cc27 P 0ecacd125c104184b71910c5597cef64 [term 1 FOLLOWER]: Refusing update from remote peer 2547ecb5f0e74137add24df5b3fbac48: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251212 21:10:32.351291 24513 consensus_queue.cc:1048] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba [LEADER]: Connected to new peer: Peer: permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251212 21:10:32.351296 24506 consensus_queue.cc:1048] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Connected to new peer: Peer: permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251212 21:10:32.351560 24306 raft_consensus.cc:1275] T b47a8b191555448b9dd7ec692a253b73 P 0ecacd125c104184b71910c5597cef64 [term 1 FOLLOWER]: Refusing update from remote peer 2547ecb5f0e74137add24df5b3fbac48: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251212 21:10:32.351929 24175 raft_consensus.cc:1275] T a3ecd74d94aa4867aab9e37f1675cc27 P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Refusing update from remote peer 2547ecb5f0e74137add24df5b3fbac48: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251212 21:10:32.351934 24528 consensus_queue.cc:1048] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Connected to new peer: Peer: permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251212 21:10:32.352072 24173 raft_consensus.cc:1275] T b47a8b191555448b9dd7ec692a253b73 P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Refusing update from remote peer 2547ecb5f0e74137add24df5b3fbac48: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251212 21:10:32.352305 24528 consensus_queue.cc:1048] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Connected to new peer: Peer: permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251212 21:10:32.352422 24528 consensus_queue.cc:1048] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Connected to new peer: Peer: permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251212 21:10:32.353057 24174 raft_consensus.cc:1275] T 9831c0f3d91045398ce4c2183f7db85d P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Refusing update from remote peer 2547ecb5f0e74137add24df5b3fbac48: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251212 21:10:32.353147 24303 raft_consensus.cc:1275] T 9831c0f3d91045398ce4c2183f7db85d P 0ecacd125c104184b71910c5597cef64 [term 1 FOLLOWER]: Refusing update from remote peer 2547ecb5f0e74137add24df5b3fbac48: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251212 21:10:32.353261 24171 raft_consensus.cc:1275] T 0ea0cdb48a6640879d462b9e52dcbcbd P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Refusing update from remote peer 2547ecb5f0e74137add24df5b3fbac48: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251212 21:10:32.353286 24302 raft_consensus.cc:1275] T 0ea0cdb48a6640879d462b9e52dcbcbd P 0ecacd125c104184b71910c5597cef64 [term 1 FOLLOWER]: Refusing update from remote peer 2547ecb5f0e74137add24df5b3fbac48: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251212 21:10:32.353484 24301 raft_consensus.cc:1275] T d0f2580352f640e794cf95624f9b32d0 P 0ecacd125c104184b71910c5597cef64 [term 1 FOLLOWER]: Refusing update from remote peer 2547ecb5f0e74137add24df5b3fbac48: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251212 21:10:32.353716 24171 raft_consensus.cc:1275] T d0f2580352f640e794cf95624f9b32d0 P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Refusing update from remote peer 2547ecb5f0e74137add24df5b3fbac48: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251212 21:10:32.353981 24506 consensus_queue.cc:1048] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Connected to new peer: Peer: permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251212 21:10:32.354100 24528 consensus_queue.cc:1048] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Connected to new peer: Peer: permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251212 21:10:32.354106 24506 consensus_queue.cc:1048] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Connected to new peer: Peer: permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251212 21:10:32.354357 24533 consensus_queue.cc:1048] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Connected to new peer: Peer: permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251212 21:10:32.354477 24533 consensus_queue.cc:1048] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Connected to new peer: Peer: permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251212 21:10:32.354550 24533 consensus_queue.cc:1048] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Connected to new peer: Peer: permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251212 21:10:32.356297 24546 mvcc.cc:204] Tried to move back new op lower bound from 7231790417308188672 to 7231790417008939008. Current Snapshot: MvccSnapshot[applied={T|T < 7231790417308188672}]
I20251212 21:10:32.357637 24544 mvcc.cc:204] Tried to move back new op lower bound from 7231790417308188672 to 7231790417008939008. Current Snapshot: MvccSnapshot[applied={T|T < 7231790417308188672}]
I20251212 21:10:32.359413 24554 mvcc.cc:204] Tried to move back new op lower bound from 7231790417308188672 to 7231790417008939008. Current Snapshot: MvccSnapshot[applied={T|T < 7231790417308188672}]
W20251212 21:10:32.454485 24222 tablet_replica_mm_ops.cc:318] Log GC is disabled (check --enable_log_gc)
W20251212 21:10:32.466725 24484 tablet_replica_mm_ops.cc:318] Log GC is disabled (check --enable_log_gc)
W20251212 21:10:32.487543 24541 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:10:32.487802 24541 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:10:32.487833 24541 flags.cc:425] Enabled unsafe flag: --enable_log_gc=false
W20251212 21:10:32.487855 24541 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:10:32.490330 24541 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:10:32.490423 24541 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.132
I20251212 21:10:32.493039 24541 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:34329
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.132:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.23.110.132
--webserver_port=0
--enable_log_gc=false
--tserver_master_addrs=127.23.110.190:46669
--never_fsync=true
--heap_profile_path=/tmp/kudu.24541
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.132
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:10:32.493371 24541 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:10:32.493659 24541 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:10:32.497833 24604 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:32.498090 24606 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:32.498152 24603 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:10:32.499042 24541 server_base.cc:1047] running on GCE node
I20251212 21:10:32.499331 24541 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:10:32.499603 24541 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:10:32.507191 24541 hybrid_clock.cc:648] HybridClock initialized: now 1765573832507158 us; error 40 us; skew 500 ppm
I20251212 21:10:32.509225 24541 webserver.cc:492] Webserver started at http://127.23.110.132:40875/ using document root <none> and password file <none>
I20251212 21:10:32.509521 24541 fs_manager.cc:362] Metadata directory not provided
I20251212 21:10:32.509585 24541 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:10:32.509712 24541 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20251212 21:10:32.510799 24541 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-3/data/instance:
uuid: "d4ac390a2b804f15815d2ba3f4919d61"
format_stamp: "Formatted at 2025-12-12 21:10:32 on dist-test-slave-rz82"
I20251212 21:10:32.511162 24541 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-3/wal/instance:
uuid: "d4ac390a2b804f15815d2ba3f4919d61"
format_stamp: "Formatted at 2025-12-12 21:10:32 on dist-test-slave-rz82"
I20251212 21:10:32.521878 24541 fs_manager.cc:696] Time spent creating directory manager: real 0.011s	user 0.000s	sys 0.003s
I20251212 21:10:32.523330 24612 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:32.523531 24541 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:32.523594 24541 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-3/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-3/wal
uuid: "d4ac390a2b804f15815d2ba3f4919d61"
format_stamp: "Formatted at 2025-12-12 21:10:32 on dist-test-slave-rz82"
I20251212 21:10:32.523664 24541 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251212 21:10:32.536366 24541 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:10:32.536763 24541 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:10:32.536971 24541 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:10:32.537266 24541 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:10:32.537640 24541 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251212 21:10:32.537765 24541 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:32.537837 24541 ts_tablet_manager.cc:616] Registered 0 tablets
I20251212 21:10:32.537860 24541 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:32.545058 24541 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.132:40853
I20251212 21:10:32.545519 24541 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-3/data/info.pb
I20251212 21:10:32.555080 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 24541
I20251212 21:10:32.555162 23994 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-3/wal/instance
I20251212 21:10:32.556068 24726 heartbeater.cc:344] Connected to a master server at 127.23.110.190:46669
I20251212 21:10:32.556185 24726 heartbeater.cc:461] Registering TS with master...
I20251212 21:10:32.556452 24726 heartbeater.cc:507] Master 127.23.110.190:46669 requested a full tablet report, sending...
I20251212 21:10:32.556699 24725 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.132:40853 every 8 connection(s)
I20251212 21:10:32.556876 24033 ts_manager.cc:194] Registered new tserver with Master: d4ac390a2b804f15815d2ba3f4919d61 (127.23.110.132:40853)
I20251212 21:10:32.557402 24033 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.132:55875
I20251212 21:10:32.632407 24033 ts_manager.cc:295] Set tserver state for f1d79b00d76349faa9c59b372f7877ba to MAINTENANCE_MODE
I20251212 21:10:32.632913 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 24093
W20251212 21:10:32.641520 24372 connection.cc:537] server connection from 127.23.110.129:56971 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20251212 21:10:32.641511 24241 connection.cc:537] server connection from 127.23.110.129:33761 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20251212 21:10:32.641670 24373 connection.cc:537] client connection to 127.23.110.129:41523 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20251212 21:10:32.641744 24373 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20251212 21:10:32.641798 24494 connection.cc:537] client connection to 127.23.110.129:41523 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20251212 21:10:32.641916 24494 meta_cache.cc:302] tablet 074cc25aad644982ae7d025718ae01f5: replica f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523) has failed: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20251212 21:10:32.642238 24373 consensus_peers.cc:597] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251212 21:10:32.642280 24373 consensus_peers.cc:597] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251212 21:10:32.642293 24373 consensus_peers.cc:597] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251212 21:10:32.642306 24373 consensus_peers.cc:597] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251212 21:10:32.642324 24373 consensus_peers.cc:597] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251212 21:10:32.646026 24266 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46230: Illegal state: replica 0ecacd125c104184b71910c5597cef64 is not leader of this config: current role FOLLOWER
W20251212 21:10:32.646694 24265 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46230: Illegal state: replica 0ecacd125c104184b71910c5597cef64 is not leader of this config: current role FOLLOWER
W20251212 21:10:32.650400 24387 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37108: Illegal state: replica 2547ecb5f0e74137add24df5b3fbac48 is not leader of this config: current role FOLLOWER
W20251212 21:10:32.653594 24387 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37108: Illegal state: replica 2547ecb5f0e74137add24df5b3fbac48 is not leader of this config: current role FOLLOWER
W20251212 21:10:32.663199 24265 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46230: Illegal state: replica 0ecacd125c104184b71910c5597cef64 is not leader of this config: current role FOLLOWER
W20251212 21:10:32.664322 24265 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46230: Illegal state: replica 0ecacd125c104184b71910c5597cef64 is not leader of this config: current role FOLLOWER
W20251212 21:10:32.672235 24387 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37108: Illegal state: replica 2547ecb5f0e74137add24df5b3fbac48 is not leader of this config: current role FOLLOWER
W20251212 21:10:32.673142 24387 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37108: Illegal state: replica 2547ecb5f0e74137add24df5b3fbac48 is not leader of this config: current role FOLLOWER
W20251212 21:10:32.689002 24265 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46230: Illegal state: replica 0ecacd125c104184b71910c5597cef64 is not leader of this config: current role FOLLOWER
W20251212 21:10:32.695134 24265 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46230: Illegal state: replica 0ecacd125c104184b71910c5597cef64 is not leader of this config: current role FOLLOWER
W20251212 21:10:32.698822 24387 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37108: Illegal state: replica 2547ecb5f0e74137add24df5b3fbac48 is not leader of this config: current role FOLLOWER
W20251212 21:10:32.706046 24387 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37108: Illegal state: replica 2547ecb5f0e74137add24df5b3fbac48 is not leader of this config: current role FOLLOWER
W20251212 21:10:32.725646 24265 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46230: Illegal state: replica 0ecacd125c104184b71910c5597cef64 is not leader of this config: current role FOLLOWER
W20251212 21:10:32.727777 24265 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46230: Illegal state: replica 0ecacd125c104184b71910c5597cef64 is not leader of this config: current role FOLLOWER
W20251212 21:10:32.737619 24387 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37108: Illegal state: replica 2547ecb5f0e74137add24df5b3fbac48 is not leader of this config: current role FOLLOWER
W20251212 21:10:32.741725 24387 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37108: Illegal state: replica 2547ecb5f0e74137add24df5b3fbac48 is not leader of this config: current role FOLLOWER
W20251212 21:10:32.771049 24265 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46230: Illegal state: replica 0ecacd125c104184b71910c5597cef64 is not leader of this config: current role FOLLOWER
W20251212 21:10:32.786976 24387 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37108: Illegal state: replica 2547ecb5f0e74137add24df5b3fbac48 is not leader of this config: current role FOLLOWER
W20251212 21:10:32.789744 24387 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37108: Illegal state: replica 2547ecb5f0e74137add24df5b3fbac48 is not leader of this config: current role FOLLOWER
W20251212 21:10:32.806650 24265 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46230: Illegal state: replica 0ecacd125c104184b71910c5597cef64 is not leader of this config: current role FOLLOWER
W20251212 21:10:32.823863 24265 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46230: Illegal state: replica 0ecacd125c104184b71910c5597cef64 is not leader of this config: current role FOLLOWER
W20251212 21:10:32.842613 24265 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46230: Illegal state: replica 0ecacd125c104184b71910c5597cef64 is not leader of this config: current role FOLLOWER
W20251212 21:10:32.842713 24387 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37108: Illegal state: replica 2547ecb5f0e74137add24df5b3fbac48 is not leader of this config: current role FOLLOWER
W20251212 21:10:32.865394 24387 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37108: Illegal state: replica 2547ecb5f0e74137add24df5b3fbac48 is not leader of this config: current role FOLLOWER
W20251212 21:10:32.887874 24387 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37108: Illegal state: replica 2547ecb5f0e74137add24df5b3fbac48 is not leader of this config: current role FOLLOWER
W20251212 21:10:32.912822 24266 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46230: Illegal state: replica 0ecacd125c104184b71910c5597cef64 is not leader of this config: current role FOLLOWER
W20251212 21:10:32.912822 24265 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46230: Illegal state: replica 0ecacd125c104184b71910c5597cef64 is not leader of this config: current role FOLLOWER
I20251212 21:10:32.932477 24732 raft_consensus.cc:493] T 074cc25aad644982ae7d025718ae01f5 P 0ecacd125c104184b71910c5597cef64 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader f1d79b00d76349faa9c59b372f7877ba)
I20251212 21:10:32.932593 24732 raft_consensus.cc:515] T 074cc25aad644982ae7d025718ae01f5 P 0ecacd125c104184b71910c5597cef64 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.932900 24732 leader_election.cc:290] T 074cc25aad644982ae7d025718ae01f5 P 0ecacd125c104184b71910c5597cef64 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523), 2547ecb5f0e74137add24df5b3fbac48 (127.23.110.131:35821)
I20251212 21:10:32.933100 24598 raft_consensus.cc:493] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader f1d79b00d76349faa9c59b372f7877ba)
I20251212 21:10:32.933207 24598 raft_consensus.cc:515] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.933372 24598 leader_election.cc:290] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523), 0ecacd125c104184b71910c5597cef64 (127.23.110.130:35585)
W20251212 21:10:32.933497 24242 proxy.cc:239] Call had error, refreshing address and retrying: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111)
I20251212 21:10:32.933732 24302 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "074cc25aad644982ae7d025718ae01f5" candidate_uuid: "2547ecb5f0e74137add24df5b3fbac48" candidate_term: 2 candidate_status { last_received { term: 1 index: 98 } } ignore_live_leader: false dest_uuid: "0ecacd125c104184b71910c5597cef64" is_pre_election: true
I20251212 21:10:32.933811 24302 raft_consensus.cc:2468] T 074cc25aad644982ae7d025718ae01f5 P 0ecacd125c104184b71910c5597cef64 [term 1 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 2547ecb5f0e74137add24df5b3fbac48 in term 1.
I20251212 21:10:32.934033 24372 leader_election.cc:304] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 0ecacd125c104184b71910c5597cef64, 2547ecb5f0e74137add24df5b3fbac48; no voters: 
W20251212 21:10:32.934042 24242 leader_election.cc:336] T 074cc25aad644982ae7d025718ae01f5 P 0ecacd125c104184b71910c5597cef64 [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111)
I20251212 21:10:32.934113 24598 raft_consensus.cc:2804] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [term 1 FOLLOWER]: Leader pre-election won for term 2
W20251212 21:10:32.934118 24373 leader_election.cc:336] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111)
I20251212 21:10:32.934152 24598 raft_consensus.cc:493] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [term 1 FOLLOWER]: Starting leader election (detected failure of leader f1d79b00d76349faa9c59b372f7877ba)
I20251212 21:10:32.934191 24598 raft_consensus.cc:3060] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [term 1 FOLLOWER]: Advancing to term 2
I20251212 21:10:32.935038 24598 raft_consensus.cc:515] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.935174 24598 leader_election.cc:290] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [CANDIDATE]: Term 2 election: Requested vote from peers f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523), 0ecacd125c104184b71910c5597cef64 (127.23.110.130:35585)
I20251212 21:10:32.935416 24302 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "074cc25aad644982ae7d025718ae01f5" candidate_uuid: "2547ecb5f0e74137add24df5b3fbac48" candidate_term: 2 candidate_status { last_received { term: 1 index: 98 } } ignore_live_leader: false dest_uuid: "0ecacd125c104184b71910c5597cef64"
I20251212 21:10:32.935493 24302 raft_consensus.cc:3060] T 074cc25aad644982ae7d025718ae01f5 P 0ecacd125c104184b71910c5597cef64 [term 1 FOLLOWER]: Advancing to term 2
W20251212 21:10:32.935662 24387 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37108: Illegal state: replica 2547ecb5f0e74137add24df5b3fbac48 is not leader of this config: current role FOLLOWER
I20251212 21:10:32.936298 24302 raft_consensus.cc:2468] T 074cc25aad644982ae7d025718ae01f5 P 0ecacd125c104184b71910c5597cef64 [term 2 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 2547ecb5f0e74137add24df5b3fbac48 in term 2.
I20251212 21:10:32.936465 24372 leader_election.cc:304] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 0ecacd125c104184b71910c5597cef64, 2547ecb5f0e74137add24df5b3fbac48; no voters: 
I20251212 21:10:32.936548 24598 raft_consensus.cc:2804] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [term 2 FOLLOWER]: Leader election won for term 2
I20251212 21:10:32.936587 24598 raft_consensus.cc:697] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [term 2 LEADER]: Becoming Leader. State: Replica: 2547ecb5f0e74137add24df5b3fbac48, State: Running, Role: LEADER
I20251212 21:10:32.936630 24598 consensus_queue.cc:237] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 96, Committed index: 96, Last appended: 1.98, Last appended by leader: 98, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:32.936959 24437 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "074cc25aad644982ae7d025718ae01f5" candidate_uuid: "0ecacd125c104184b71910c5597cef64" candidate_term: 2 candidate_status { last_received { term: 1 index: 98 } } ignore_live_leader: false dest_uuid: "2547ecb5f0e74137add24df5b3fbac48" is_pre_election: true
W20251212 21:10:32.937073 24373 leader_election.cc:336] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [CANDIDATE]: Term 2 election: RPC error from VoteRequest() call to peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111)
I20251212 21:10:32.937165 24242 leader_election.cc:304] T 074cc25aad644982ae7d025718ae01f5 P 0ecacd125c104184b71910c5597cef64 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 0ecacd125c104184b71910c5597cef64; no voters: 2547ecb5f0e74137add24df5b3fbac48, f1d79b00d76349faa9c59b372f7877ba
I20251212 21:10:32.937290 24732 raft_consensus.cc:2749] T 074cc25aad644982ae7d025718ae01f5 P 0ecacd125c104184b71910c5597cef64 [term 2 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20251212 21:10:32.937187 24033 catalog_manager.cc:5654] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 reported cstate change: term changed from 1 to 2, leader changed from f1d79b00d76349faa9c59b372f7877ba (127.23.110.129) to 2547ecb5f0e74137add24df5b3fbac48 (127.23.110.131). New cstate: current_term: 2 leader_uuid: "2547ecb5f0e74137add24df5b3fbac48" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } health_report { overall_health: HEALTHY } } }
I20251212 21:10:32.961853 24302 raft_consensus.cc:1275] T 074cc25aad644982ae7d025718ae01f5 P 0ecacd125c104184b71910c5597cef64 [term 2 FOLLOWER]: Refusing update from remote peer 2547ecb5f0e74137add24df5b3fbac48: Log matching property violated. Preceding OpId in replica: term: 1 index: 98. Preceding OpId from leader: term: 2 index: 100. (index mismatch)
I20251212 21:10:32.962093 24598 consensus_queue.cc:1048] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Connected to new peer: Peer: permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 99, Last known committed idx: 96, Time since last communication: 0.000s
W20251212 21:10:32.962224 24373 consensus_peers.cc:597] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251212 21:10:33.100054 24373 consensus_peers.cc:597] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251212 21:10:33.114127 24373 consensus_peers.cc:597] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251212 21:10:33.138218 24373 consensus_peers.cc:597] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251212 21:10:33.148183 24373 consensus_peers.cc:597] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251212 21:10:33.148576 24373 consensus_peers.cc:597] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251212 21:10:33.510288 24373 consensus_peers.cc:597] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
I20251212 21:10:33.567067 24726 heartbeater.cc:499] Master 127.23.110.190:46669 was elected leader, sending a full tablet report...
W20251212 21:10:33.605406 24373 consensus_peers.cc:597] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251212 21:10:33.608304 24373 consensus_peers.cc:597] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251212 21:10:33.622010 24373 consensus_peers.cc:597] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251212 21:10:33.631280 24373 consensus_peers.cc:597] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251212 21:10:33.634344 24373 consensus_peers.cc:597] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251212 21:10:34.044934 24373 consensus_peers.cc:597] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251212 21:10:34.101073 24373 consensus_peers.cc:597] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251212 21:10:34.155135 24373 consensus_peers.cc:597] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251212 21:10:34.160910 24373 consensus_peers.cc:597] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251212 21:10:34.161278 24373 consensus_peers.cc:597] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251212 21:10:34.176996 24373 consensus_peers.cc:597] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251212 21:10:34.536227 24373 consensus_peers.cc:597] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
I20251212 21:10:34.637574 24747 consensus_queue.cc:579] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Leader has been unable to successfully communicate with peer f1d79b00d76349faa9c59b372f7877ba for more than 2 seconds (2.005s)
W20251212 21:10:34.644093 24373 consensus_peers.cc:597] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20251212 21:10:34.656208 24749 consensus_queue.cc:579] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Leader has been unable to successfully communicate with peer f1d79b00d76349faa9c59b372f7877ba for more than 2 seconds (2.025s)
W20251212 21:10:34.662374 24373 consensus_peers.cc:597] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20251212 21:10:34.671315 24749 consensus_queue.cc:579] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Leader has been unable to successfully communicate with peer f1d79b00d76349faa9c59b372f7877ba for more than 2 seconds (2.038s)
W20251212 21:10:34.673653 24373 consensus_peers.cc:597] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20251212 21:10:34.712343 24749 consensus_queue.cc:579] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Leader has been unable to successfully communicate with peer f1d79b00d76349faa9c59b372f7877ba for more than 2 seconds (2.081s)
W20251212 21:10:34.715196 24373 consensus_peers.cc:597] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20251212 21:10:34.730257 24597 consensus_queue.cc:579] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Leader has been unable to successfully communicate with peer f1d79b00d76349faa9c59b372f7877ba for more than 2 seconds (2.096s)
W20251212 21:10:34.734280 24373 consensus_peers.cc:597] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20251212 21:10:35.000289 24740 consensus_queue.cc:579] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Leader has been unable to successfully communicate with peer f1d79b00d76349faa9c59b372f7877ba for more than 2 seconds (2.064s)
W20251212 21:10:35.003974 24373 consensus_peers.cc:597] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20251212 21:10:35.141799 24373 consensus_peers.cc:597] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251212 21:10:35.215687 24373 consensus_peers.cc:597] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251212 21:10:35.233125 24373 consensus_peers.cc:597] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251212 21:10:35.233290 24373 consensus_peers.cc:597] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251212 21:10:35.257704 24373 consensus_peers.cc:597] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251212 21:10:35.533721 24373 consensus_peers.cc:597] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251212 21:10:35.623728 24373 consensus_peers.cc:597] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
I20251212 21:10:35.672806 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 24002
W20251212 21:10:35.673921 24373 consensus_peers.cc:597] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
I20251212 21:10:35.681072 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.23.110.190:46669
--webserver_interface=127.23.110.190
--webserver_port=43745
--builtin_ntp_servers=127.23.110.148:34329
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.23.110.190:46669 with env {}
W20251212 21:10:35.694653 24373 consensus_peers.cc:597] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20251212 21:10:35.732743 24373 consensus_peers.cc:597] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20251212 21:10:35.800238 24373 consensus_peers.cc:597] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20251212 21:10:35.834507 24754 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:10:35.834898 24754 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:10:35.834987 24754 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:10:35.837086 24754 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20251212 21:10:35.837255 24754 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:10:35.837324 24754 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20251212 21:10:35.837390 24754 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20251212 21:10:35.839339 24754 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:34329
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.23.110.190:46669
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.23.110.190:46669
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.23.110.190
--webserver_port=43745
--never_fsync=true
--heap_profile_path=/tmp/kudu.24754
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:10:35.839628 24754 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:10:35.839880 24754 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:10:35.842880 24760 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:35.842880 24759 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:35.850587 24762 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:10:35.851516 24754 server_base.cc:1047] running on GCE node
I20251212 21:10:35.851997 24754 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:10:35.852274 24754 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:10:35.859841 24754 hybrid_clock.cc:648] HybridClock initialized: now 1765573835859773 us; error 71 us; skew 500 ppm
I20251212 21:10:35.861887 24754 webserver.cc:492] Webserver started at http://127.23.110.190:43745/ using document root <none> and password file <none>
I20251212 21:10:35.862104 24754 fs_manager.cc:362] Metadata directory not provided
I20251212 21:10:35.862160 24754 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:10:35.863544 24754 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:35.864253 24768 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:35.864467 24754 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:35.864543 24754 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/wal
uuid: "68cde708327e4da5bdb1432082ba3f7b"
format_stamp: "Formatted at 2025-12-12 21:10:31 on dist-test-slave-rz82"
I20251212 21:10:35.864836 24754 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251212 21:10:35.908932 24754 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:10:35.909416 24754 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:10:35.909754 24754 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:10:35.915508 24754 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.190:46669
I20251212 21:10:35.915524 24820 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.190:46669 every 8 connection(s)
I20251212 21:10:35.915982 24754 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/master-0/data/info.pb
I20251212 21:10:35.917579 24821 sys_catalog.cc:263] Verifying existing consensus state
I20251212 21:10:35.918432 24821 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b: Bootstrap starting.
I20251212 21:10:35.920719 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 24754
I20251212 21:10:35.927762 24821 log.cc:826] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b: Log is configured to *not* fsync() on all Append() calls
I20251212 21:10:35.930508 24821 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b: Bootstrap replayed 1/1 log segments. Stats: ops{read=14 overwritten=0 applied=14 ignored=0} inserts{seen=11 ignored=0} mutations{seen=13 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251212 21:10:35.930925 24821 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b: Bootstrap complete.
I20251212 21:10:35.933704 24821 raft_consensus.cc:359] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "68cde708327e4da5bdb1432082ba3f7b" member_type: VOTER last_known_addr { host: "127.23.110.190" port: 46669 } }
I20251212 21:10:35.934108 24821 raft_consensus.cc:740] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 68cde708327e4da5bdb1432082ba3f7b, State: Initialized, Role: FOLLOWER
I20251212 21:10:35.934341 24821 consensus_queue.cc:260] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 14, Last appended: 1.14, Last appended by leader: 14, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "68cde708327e4da5bdb1432082ba3f7b" member_type: VOTER last_known_addr { host: "127.23.110.190" port: 46669 } }
I20251212 21:10:35.934486 24821 raft_consensus.cc:399] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [term 1 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20251212 21:10:35.934587 24821 raft_consensus.cc:493] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [term 1 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20251212 21:10:35.934679 24821 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [term 1 FOLLOWER]: Advancing to term 2
I20251212 21:10:35.935848 24821 raft_consensus.cc:515] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "68cde708327e4da5bdb1432082ba3f7b" member_type: VOTER last_known_addr { host: "127.23.110.190" port: 46669 } }
I20251212 21:10:35.936033 24821 leader_election.cc:304] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 68cde708327e4da5bdb1432082ba3f7b; no voters: 
I20251212 21:10:35.936259 24821 leader_election.cc:290] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [CANDIDATE]: Term 2 election: Requested vote from peers 
I20251212 21:10:35.936581 24821 sys_catalog.cc:565] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [sys.catalog]: configured and running, proceeding with master startup.
I20251212 21:10:35.938864 24836 catalog_manager.cc:1269] Loaded cluster ID: 9b84b6e4e0ca404cb9b70fc04d1c13f0
I20251212 21:10:35.938967 24836 catalog_manager.cc:1562] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b: loading cluster ID for follower catalog manager: success
I20251212 21:10:35.940255 24836 catalog_manager.cc:1584] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b: acquiring CA information for follower catalog manager: success
I20251212 21:10:35.940671 24836 catalog_manager.cc:1612] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b: importing token verification keys for follower catalog manager: success; most recent TSK sequence number 0
I20251212 21:10:35.940908 24825 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [term 2 FOLLOWER]: Leader election won for term 2
I20251212 21:10:35.941074 24825 raft_consensus.cc:697] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [term 2 LEADER]: Becoming Leader. State: Replica: 68cde708327e4da5bdb1432082ba3f7b, State: Running, Role: LEADER
I20251212 21:10:35.941208 24825 consensus_queue.cc:237] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 14, Committed index: 14, Last appended: 1.14, Last appended by leader: 14, Current term: 2, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "68cde708327e4da5bdb1432082ba3f7b" member_type: VOTER last_known_addr { host: "127.23.110.190" port: 46669 } }
I20251212 21:10:35.941533 24825 sys_catalog.cc:455] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 2 leader_uuid: "68cde708327e4da5bdb1432082ba3f7b" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "68cde708327e4da5bdb1432082ba3f7b" member_type: VOTER last_known_addr { host: "127.23.110.190" port: 46669 } } }
I20251212 21:10:35.941648 24825 sys_catalog.cc:458] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [sys.catalog]: This master's current role is: LEADER
I20251212 21:10:35.941792 24825 sys_catalog.cc:455] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [sys.catalog]: SysCatalogTable state changed. Reason: New leader 68cde708327e4da5bdb1432082ba3f7b. Latest consensus state: current_term: 2 leader_uuid: "68cde708327e4da5bdb1432082ba3f7b" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "68cde708327e4da5bdb1432082ba3f7b" member_type: VOTER last_known_addr { host: "127.23.110.190" port: 46669 } } }
I20251212 21:10:35.945273 24825 sys_catalog.cc:458] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b [sys.catalog]: This master's current role is: LEADER
I20251212 21:10:35.945484 24840 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20251212 21:10:35.945833 24840 catalog_manager.cc:679] Loaded metadata for table test-workload [id=36b89bd33e2b4f068e2cb64fca3c13b6]
I20251212 21:10:35.948846 24840 tablet_loader.cc:96] loaded metadata for tablet 074cc25aad644982ae7d025718ae01f5 (table test-workload [id=36b89bd33e2b4f068e2cb64fca3c13b6])
I20251212 21:10:35.949071 24840 tablet_loader.cc:96] loaded metadata for tablet 0ea0cdb48a6640879d462b9e52dcbcbd (table test-workload [id=36b89bd33e2b4f068e2cb64fca3c13b6])
I20251212 21:10:35.949164 24840 tablet_loader.cc:96] loaded metadata for tablet 9831c0f3d91045398ce4c2183f7db85d (table test-workload [id=36b89bd33e2b4f068e2cb64fca3c13b6])
I20251212 21:10:35.949223 24840 tablet_loader.cc:96] loaded metadata for tablet a3ecd74d94aa4867aab9e37f1675cc27 (table test-workload [id=36b89bd33e2b4f068e2cb64fca3c13b6])
I20251212 21:10:35.949340 24840 tablet_loader.cc:96] loaded metadata for tablet b47a8b191555448b9dd7ec692a253b73 (table test-workload [id=36b89bd33e2b4f068e2cb64fca3c13b6])
I20251212 21:10:35.949399 24840 tablet_loader.cc:96] loaded metadata for tablet d0f2580352f640e794cf95624f9b32d0 (table test-workload [id=36b89bd33e2b4f068e2cb64fca3c13b6])
I20251212 21:10:35.949474 24840 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20251212 21:10:35.949617 24840 catalog_manager.cc:1269] Loaded cluster ID: 9b84b6e4e0ca404cb9b70fc04d1c13f0
I20251212 21:10:35.949680 24840 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20251212 21:10:35.949894 24840 catalog_manager.cc:1514] Loading token signing keys...
I20251212 21:10:35.950019 24840 catalog_manager.cc:6038] T 00000000000000000000000000000000 P 68cde708327e4da5bdb1432082ba3f7b: Loaded TSK: 0
I20251212 21:10:35.950194 24840 catalog_manager.cc:1524] Initializing in-progress tserver states...
W20251212 21:10:35.982091 24373 consensus_peers.cc:597] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
I20251212 21:10:35.984130 24784 master_service.cc:438] Got heartbeat from unknown tserver (permanent_uuid: "0ecacd125c104184b71910c5597cef64" instance_seqno: 1765573832088817) as {username='slave'} at 127.23.110.130:44887; Asking this server to re-register.
I20251212 21:10:35.986804 24352 heartbeater.cc:461] Registering TS with master...
I20251212 21:10:35.986896 24352 heartbeater.cc:507] Master 127.23.110.190:46669 requested a full tablet report, sending...
I20251212 21:10:35.987558 24784 ts_manager.cc:194] Registered new tserver with Master: 0ecacd125c104184b71910c5597cef64 (127.23.110.130:35585)
I20251212 21:10:36.008005 24784 master_service.cc:438] Got heartbeat from unknown tserver (permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" instance_seqno: 1765573832214143) as {username='slave'} at 127.23.110.131:46085; Asking this server to re-register.
I20251212 21:10:36.009297 24483 heartbeater.cc:461] Registering TS with master...
I20251212 21:10:36.009382 24483 heartbeater.cc:507] Master 127.23.110.190:46669 requested a full tablet report, sending...
I20251212 21:10:36.010553 24784 ts_manager.cc:194] Registered new tserver with Master: 2547ecb5f0e74137add24df5b3fbac48 (127.23.110.131:35821)
W20251212 21:10:36.120072 24373 consensus_peers.cc:597] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
I20251212 21:10:36.129019 24749 consensus_queue.cc:799] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Peer f1d79b00d76349faa9c59b372f7877ba is lagging by at least 2 ops behind the committed index 
W20251212 21:10:36.150519 24373 consensus_peers.cc:597] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
I20251212 21:10:36.182133 24748 consensus_queue.cc:799] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Peer f1d79b00d76349faa9c59b372f7877ba is lagging by at least 23 ops behind the committed index 
I20251212 21:10:36.189504 24740 consensus_queue.cc:799] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Peer f1d79b00d76349faa9c59b372f7877ba is lagging by at least 29 ops behind the committed index 
I20251212 21:10:36.199018 24745 consensus_queue.cc:799] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Peer f1d79b00d76349faa9c59b372f7877ba is lagging by at least 22 ops behind the committed index 
W20251212 21:10:36.203555 24373 consensus_peers.cc:597] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
I20251212 21:10:36.204241 24740 consensus_queue.cc:799] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Peer f1d79b00d76349faa9c59b372f7877ba is lagging by at least 28 ops behind the committed index 
W20251212 21:10:36.220145 24373 consensus_peers.cc:597] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
I20251212 21:10:36.232527 24748 consensus_queue.cc:799] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Peer f1d79b00d76349faa9c59b372f7877ba is lagging by at least 32 ops behind the committed index 
W20251212 21:10:36.321836 24373 consensus_peers.cc:597] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
W20251212 21:10:36.519680 24373 consensus_peers.cc:597] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 36: this message will repeat every 5th retry.
I20251212 21:10:36.591351 24784 master_service.cc:438] Got heartbeat from unknown tserver (permanent_uuid: "d4ac390a2b804f15815d2ba3f4919d61" instance_seqno: 1765573832543103) as {username='slave'} at 127.23.110.132:48275; Asking this server to re-register.
I20251212 21:10:36.592118 24726 heartbeater.cc:461] Registering TS with master...
I20251212 21:10:36.592208 24726 heartbeater.cc:507] Master 127.23.110.190:46669 requested a full tablet report, sending...
I20251212 21:10:36.593019 24784 ts_manager.cc:194] Registered new tserver with Master: d4ac390a2b804f15815d2ba3f4919d61 (127.23.110.132:40853)
W20251212 21:10:36.629573 24373 consensus_peers.cc:597] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20251212 21:10:36.745492 24373 consensus_peers.cc:597] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20251212 21:10:36.747028 24373 consensus_peers.cc:597] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20251212 21:10:36.795290 24373 consensus_peers.cc:597] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20251212 21:10:36.805356 24373 consensus_peers.cc:597] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20251212 21:10:37.053520 24373 consensus_peers.cc:597] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 41: this message will repeat every 5th retry.
W20251212 21:10:37.123988 24373 consensus_peers.cc:597] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20251212 21:10:37.219805 24373 consensus_peers.cc:597] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20251212 21:10:37.296940 24373 consensus_peers.cc:597] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20251212 21:10:37.342376 24373 consensus_peers.cc:597] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20251212 21:10:37.346220 24373 consensus_peers.cc:597] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20251212 21:10:37.632064 24373 consensus_peers.cc:597] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20251212 21:10:37.635604 24373 consensus_peers.cc:597] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 46: this message will repeat every 5th retry.
W20251212 21:10:37.677822 24373 proxy.cc:239] Call had error, refreshing address and retrying: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111) [suppressed 294 similar messages]
W20251212 21:10:37.698994 24373 consensus_peers.cc:597] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20251212 21:10:37.846298 24373 consensus_peers.cc:597] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20251212 21:10:37.863106 24373 consensus_peers.cc:597] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20251212 21:10:37.867249 24373 consensus_peers.cc:597] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20251212 21:10:38.133405 24373 consensus_peers.cc:597] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20251212 21:10:38.172083 24373 consensus_peers.cc:597] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 51: this message will repeat every 5th retry.
W20251212 21:10:38.202677 24373 consensus_peers.cc:597] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20251212 21:10:38.391875 24373 consensus_peers.cc:597] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20251212 21:10:38.398676 24373 consensus_peers.cc:597] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20251212 21:10:38.425594 24373 consensus_peers.cc:597] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20251212 21:10:38.611485 24373 consensus_peers.cc:597] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
W20251212 21:10:38.679203 24373 consensus_peers.cc:597] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 56: this message will repeat every 5th retry.
W20251212 21:10:38.688146 24373 consensus_peers.cc:597] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
W20251212 21:10:38.863754 24373 consensus_peers.cc:597] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
I20251212 21:10:38.925766 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.129:41523
--local_ip_for_outbound_sockets=127.23.110.129
--tserver_master_addrs=127.23.110.190:46669
--webserver_port=41077
--webserver_interface=127.23.110.129
--builtin_ntp_servers=127.23.110.148:34329
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--follower_unavailable_considered_failed_sec=2
--enable_log_gc=false with env {}
W20251212 21:10:38.959864 24373 consensus_peers.cc:597] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
W20251212 21:10:38.967314 24373 consensus_peers.cc:597] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 61: this message will repeat every 5th retry.
W20251212 21:10:39.066737 24851 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:10:39.066951 24851 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:10:39.066977 24851 flags.cc:425] Enabled unsafe flag: --enable_log_gc=false
W20251212 21:10:39.067004 24851 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:10:39.068531 24851 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:10:39.068596 24851 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.129
I20251212 21:10:39.070683 24851 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:34329
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--follower_unavailable_considered_failed_sec=2
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.129:41523
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.23.110.129
--webserver_port=41077
--enable_log_gc=false
--tserver_master_addrs=127.23.110.190:46669
--never_fsync=true
--heap_profile_path=/tmp/kudu.24851
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.129
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:10:39.071169 24851 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:10:39.071507 24851 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:10:39.075798 24856 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:39.077941 24859 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:10:39.078984 24851 server_base.cc:1047] running on GCE node
W20251212 21:10:39.086014 24857 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:10:39.086848 24851 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:10:39.087162 24851 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:10:39.090641 24851 hybrid_clock.cc:648] HybridClock initialized: now 1765573839090548 us; error 114 us; skew 500 ppm
I20251212 21:10:39.093429 24851 webserver.cc:492] Webserver started at http://127.23.110.129:41077/ using document root <none> and password file <none>
I20251212 21:10:39.093688 24851 fs_manager.cc:362] Metadata directory not provided
I20251212 21:10:39.093912 24851 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:10:39.098220 24851 fs_manager.cc:714] Time spent opening directory manager: real 0.003s	user 0.000s	sys 0.001s
I20251212 21:10:39.101368 24865 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:39.101549 24851 fs_manager.cc:730] Time spent opening block manager: real 0.002s	user 0.000s	sys 0.001s
I20251212 21:10:39.101624 24851 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/wal
uuid: "f1d79b00d76349faa9c59b372f7877ba"
format_stamp: "Formatted at 2025-12-12 21:10:31 on dist-test-slave-rz82"
I20251212 21:10:39.101990 24851 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20251212 21:10:39.127197 24373 consensus_peers.cc:597] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 66: this message will repeat every 5th retry.
I20251212 21:10:39.135588 24851 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:10:39.136008 24851 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:10:39.136169 24851 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:10:39.136395 24851 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:10:39.137115 24872 ts_tablet_manager.cc:542] Loading tablet metadata (0/6 complete)
I20251212 21:10:39.137357 24749 consensus_queue.cc:799] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Peer f1d79b00d76349faa9c59b372f7877ba is lagging by at least 1416 ops behind the committed index  [suppressed 27 similar messages]
I20251212 21:10:39.140451 24851 ts_tablet_manager.cc:585] Loaded tablet metadata (6 total tablets, 6 live tablets)
I20251212 21:10:39.140520 24851 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.004s	user 0.000s	sys 0.000s
I20251212 21:10:39.140549 24851 ts_tablet_manager.cc:600] Registering tablets (0/6 complete)
I20251212 21:10:39.141294 24872 tablet_bootstrap.cc:492] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba: Bootstrap starting.
I20251212 21:10:39.142374 24851 ts_tablet_manager.cc:616] Registered 6 tablets
I20251212 21:10:39.142457 24851 ts_tablet_manager.cc:595] Time spent register tablets: real 0.002s	user 0.002s	sys 0.000s
I20251212 21:10:39.155933 24872 log.cc:826] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba: Log is configured to *not* fsync() on all Append() calls
I20251212 21:10:39.158187 24851 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.129:41523
I20251212 21:10:39.158715 24851 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate.1765573831632761-23994-0/minicluster-data/ts-0/data/info.pb
I20251212 21:10:39.159162 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 24851
I20251212 21:10:39.166337 24980 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.129:41523 every 8 connection(s)
I20251212 21:10:39.167138 24981 heartbeater.cc:344] Connected to a master server at 127.23.110.190:46669
I20251212 21:10:39.167239 24981 heartbeater.cc:461] Registering TS with master...
I20251212 21:10:39.167510 24981 heartbeater.cc:507] Master 127.23.110.190:46669 requested a full tablet report, sending...
I20251212 21:10:39.168212 24784 ts_manager.cc:194] Registered new tserver with Master: f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523)
I20251212 21:10:39.169214 24784 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.129:40277
I20251212 21:10:39.186941 24872 tablet_bootstrap.cc:492] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba: Bootstrap replayed 1/1 log segments. Stats: ops{read=99 overwritten=0 applied=96 ignored=0} inserts{seen=766 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251212 21:10:39.187361 24872 tablet_bootstrap.cc:492] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba: Bootstrap complete.
I20251212 21:10:39.188338 24872 ts_tablet_manager.cc:1403] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba: Time spent bootstrapping tablet: real 0.047s	user 0.009s	sys 0.015s
I20251212 21:10:39.189544 24872 raft_consensus.cc:359] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Replica starting. Triggering 3 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:39.192490 24872 raft_consensus.cc:740] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: f1d79b00d76349faa9c59b372f7877ba, State: Initialized, Role: FOLLOWER
I20251212 21:10:39.193022 24872 consensus_queue.cc:260] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 96, Last appended: 1.99, Last appended by leader: 99, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:39.193681 24872 ts_tablet_manager.cc:1434] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba: Time spent starting tablet: real 0.005s	user 0.003s	sys 0.001s
I20251212 21:10:39.193681 24981 heartbeater.cc:499] Master 127.23.110.190:46669 was elected leader, sending a full tablet report...
I20251212 21:10:39.193848 24872 tablet_bootstrap.cc:492] T d0f2580352f640e794cf95624f9b32d0 P f1d79b00d76349faa9c59b372f7877ba: Bootstrap starting.
I20251212 21:10:39.194351 24582 consensus_queue.cc:799] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Peer f1d79b00d76349faa9c59b372f7877ba is lagging by at least 1437 ops behind the committed index  [suppressed 27 similar messages]
I20251212 21:10:39.232023 24919 raft_consensus.cc:3060] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Advancing to term 2
I20251212 21:10:39.234062 24919 pending_rounds.cc:85] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba: Aborting all ops after (but not including) 98
I20251212 21:10:39.234233 24919 pending_rounds.cc:107] T 074cc25aad644982ae7d025718ae01f5 P f1d79b00d76349faa9c59b372f7877ba: Aborting uncommitted WRITE_OP operation due to leader change: 1.99
W20251212 21:10:39.256548 24373 consensus_peers.cc:597] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: INITIALIZED. This is attempt 66: this message will repeat every 5th retry.
I20251212 21:10:39.267827 24582 consensus_queue.cc:799] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Peer f1d79b00d76349faa9c59b372f7877ba is lagging by at least 1465 ops behind the committed index  [suppressed 28 similar messages]
I20251212 21:10:39.277607 24738 consensus_queue.cc:799] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Peer f1d79b00d76349faa9c59b372f7877ba is lagging by at least 1456 ops behind the committed index  [suppressed 28 similar messages]
I20251212 21:10:39.279364 24872 tablet_bootstrap.cc:492] T d0f2580352f640e794cf95624f9b32d0 P f1d79b00d76349faa9c59b372f7877ba: Bootstrap replayed 1/1 log segments. Stats: ops{read=98 overwritten=0 applied=98 ignored=0} inserts{seen=777 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251212 21:10:39.279673 24872 tablet_bootstrap.cc:492] T d0f2580352f640e794cf95624f9b32d0 P f1d79b00d76349faa9c59b372f7877ba: Bootstrap complete.
I20251212 21:10:39.280685 24872 ts_tablet_manager.cc:1403] T d0f2580352f640e794cf95624f9b32d0 P f1d79b00d76349faa9c59b372f7877ba: Time spent bootstrapping tablet: real 0.087s	user 0.012s	sys 0.007s
I20251212 21:10:39.280846 24872 raft_consensus.cc:359] T d0f2580352f640e794cf95624f9b32d0 P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:39.280920 24872 raft_consensus.cc:740] T d0f2580352f640e794cf95624f9b32d0 P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: f1d79b00d76349faa9c59b372f7877ba, State: Initialized, Role: FOLLOWER
I20251212 21:10:39.282636 24872 consensus_queue.cc:260] T d0f2580352f640e794cf95624f9b32d0 P f1d79b00d76349faa9c59b372f7877ba [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 98, Last appended: 1.98, Last appended by leader: 98, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:39.282920 24872 ts_tablet_manager.cc:1434] T d0f2580352f640e794cf95624f9b32d0 P f1d79b00d76349faa9c59b372f7877ba: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
I20251212 21:10:39.283018 24872 tablet_bootstrap.cc:492] T a3ecd74d94aa4867aab9e37f1675cc27 P f1d79b00d76349faa9c59b372f7877ba: Bootstrap starting.
I20251212 21:10:39.303051 24988 consensus_queue.cc:799] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Peer f1d79b00d76349faa9c59b372f7877ba is lagging by at least 1470 ops behind the committed index  [suppressed 29 similar messages]
I20251212 21:10:39.312669 24990 mvcc.cc:204] Tried to move back new op lower bound from 7231790420132880384 to 7231790419708575744. Current Snapshot: MvccSnapshot[applied={T|T < 7231790420132880384}]
I20251212 21:10:39.334234 24872 tablet_bootstrap.cc:492] T a3ecd74d94aa4867aab9e37f1675cc27 P f1d79b00d76349faa9c59b372f7877ba: Bootstrap replayed 1/1 log segments. Stats: ops{read=98 overwritten=0 applied=98 ignored=0} inserts{seen=797 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251212 21:10:39.334576 24872 tablet_bootstrap.cc:492] T a3ecd74d94aa4867aab9e37f1675cc27 P f1d79b00d76349faa9c59b372f7877ba: Bootstrap complete.
I20251212 21:10:39.335489 24872 ts_tablet_manager.cc:1403] T a3ecd74d94aa4867aab9e37f1675cc27 P f1d79b00d76349faa9c59b372f7877ba: Time spent bootstrapping tablet: real 0.052s	user 0.010s	sys 0.008s
I20251212 21:10:39.335624 24872 raft_consensus.cc:359] T a3ecd74d94aa4867aab9e37f1675cc27 P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } }
I20251212 21:10:39.335690 24872 raft_consensus.cc:740] T a3ecd74d94aa4867aab9e37f1675cc27 P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: f1d79b00d76349faa9c59b372f7877ba, State: Initialized, Role: FOLLOWER
I20251212 21:10:39.336936 24872 consensus_queue.cc:260] T a3ecd74d94aa4867aab9e37f1675cc27 P f1d79b00d76349faa9c59b372f7877ba [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 98, Last appended: 1.98, Last appended by leader: 98, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } }
I20251212 21:10:39.340312 24872 ts_tablet_manager.cc:1434] T a3ecd74d94aa4867aab9e37f1675cc27 P f1d79b00d76349faa9c59b372f7877ba: Time spent starting tablet: real 0.005s	user 0.001s	sys 0.001s
I20251212 21:10:39.341080 24872 tablet_bootstrap.cc:492] T b47a8b191555448b9dd7ec692a253b73 P f1d79b00d76349faa9c59b372f7877ba: Bootstrap starting.
I20251212 21:10:39.354604 24533 consensus_queue.cc:799] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Peer f1d79b00d76349faa9c59b372f7877ba is lagging by at least 1474 ops behind the committed index  [suppressed 30 similar messages]
I20251212 21:10:39.384955 24872 tablet_bootstrap.cc:492] T b47a8b191555448b9dd7ec692a253b73 P f1d79b00d76349faa9c59b372f7877ba: Bootstrap replayed 1/1 log segments. Stats: ops{read=98 overwritten=0 applied=98 ignored=0} inserts{seen=799 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251212 21:10:39.385426 24872 tablet_bootstrap.cc:492] T b47a8b191555448b9dd7ec692a253b73 P f1d79b00d76349faa9c59b372f7877ba: Bootstrap complete.
I20251212 21:10:39.386533 24872 ts_tablet_manager.cc:1403] T b47a8b191555448b9dd7ec692a253b73 P f1d79b00d76349faa9c59b372f7877ba: Time spent bootstrapping tablet: real 0.045s	user 0.011s	sys 0.006s
I20251212 21:10:39.386755 24872 raft_consensus.cc:359] T b47a8b191555448b9dd7ec692a253b73 P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:39.386906 24872 raft_consensus.cc:740] T b47a8b191555448b9dd7ec692a253b73 P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: f1d79b00d76349faa9c59b372f7877ba, State: Initialized, Role: FOLLOWER
I20251212 21:10:39.387018 24872 consensus_queue.cc:260] T b47a8b191555448b9dd7ec692a253b73 P f1d79b00d76349faa9c59b372f7877ba [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 98, Last appended: 1.98, Last appended by leader: 98, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:39.387764 24872 ts_tablet_manager.cc:1434] T b47a8b191555448b9dd7ec692a253b73 P f1d79b00d76349faa9c59b372f7877ba: Time spent starting tablet: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:10:39.388263 24872 tablet_bootstrap.cc:492] T 0ea0cdb48a6640879d462b9e52dcbcbd P f1d79b00d76349faa9c59b372f7877ba: Bootstrap starting.
W20251212 21:10:39.411119 24982 tablet_replica_mm_ops.cc:318] Log GC is disabled (check --enable_log_gc)
W20251212 21:10:39.412951 24373 consensus_peers.cc:597] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 66: this message will repeat every 5th retry.
I20251212 21:10:39.436405 24872 tablet_bootstrap.cc:492] T 0ea0cdb48a6640879d462b9e52dcbcbd P f1d79b00d76349faa9c59b372f7877ba: Bootstrap replayed 1/1 log segments. Stats: ops{read=98 overwritten=0 applied=98 ignored=0} inserts{seen=832 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251212 21:10:39.436857 24872 tablet_bootstrap.cc:492] T 0ea0cdb48a6640879d462b9e52dcbcbd P f1d79b00d76349faa9c59b372f7877ba: Bootstrap complete.
I20251212 21:10:39.438022 24872 ts_tablet_manager.cc:1403] T 0ea0cdb48a6640879d462b9e52dcbcbd P f1d79b00d76349faa9c59b372f7877ba: Time spent bootstrapping tablet: real 0.050s	user 0.011s	sys 0.007s
I20251212 21:10:39.451359 24872 raft_consensus.cc:359] T 0ea0cdb48a6640879d462b9e52dcbcbd P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:39.451586 24872 raft_consensus.cc:740] T 0ea0cdb48a6640879d462b9e52dcbcbd P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: f1d79b00d76349faa9c59b372f7877ba, State: Initialized, Role: FOLLOWER
I20251212 21:10:39.451699 24872 consensus_queue.cc:260] T 0ea0cdb48a6640879d462b9e52dcbcbd P f1d79b00d76349faa9c59b372f7877ba [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 98, Last appended: 1.98, Last appended by leader: 98, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:39.451884 24872 ts_tablet_manager.cc:1434] T 0ea0cdb48a6640879d462b9e52dcbcbd P f1d79b00d76349faa9c59b372f7877ba: Time spent starting tablet: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:10:39.451974 24872 tablet_bootstrap.cc:492] T 9831c0f3d91045398ce4c2183f7db85d P f1d79b00d76349faa9c59b372f7877ba: Bootstrap starting.
W20251212 21:10:39.508826 24373 consensus_peers.cc:597] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 66: this message will repeat every 5th retry.
I20251212 21:10:39.552471 24872 tablet_bootstrap.cc:492] T 9831c0f3d91045398ce4c2183f7db85d P f1d79b00d76349faa9c59b372f7877ba: Bootstrap replayed 1/1 log segments. Stats: ops{read=98 overwritten=0 applied=98 ignored=0} inserts{seen=851 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251212 21:10:39.589617 24872 tablet_bootstrap.cc:492] T 9831c0f3d91045398ce4c2183f7db85d P f1d79b00d76349faa9c59b372f7877ba: Bootstrap complete.
I20251212 21:10:39.593793 24872 ts_tablet_manager.cc:1403] T 9831c0f3d91045398ce4c2183f7db85d P f1d79b00d76349faa9c59b372f7877ba: Time spent bootstrapping tablet: real 0.142s	user 0.012s	sys 0.008s
I20251212 21:10:39.594064 24872 raft_consensus.cc:359] T 9831c0f3d91045398ce4c2183f7db85d P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:39.594215 24872 raft_consensus.cc:740] T 9831c0f3d91045398ce4c2183f7db85d P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: f1d79b00d76349faa9c59b372f7877ba, State: Initialized, Role: FOLLOWER
I20251212 21:10:39.594327 24872 consensus_queue.cc:260] T 9831c0f3d91045398ce4c2183f7db85d P f1d79b00d76349faa9c59b372f7877ba [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 98, Last appended: 1.98, Last appended by leader: 98, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:39.594527 24872 ts_tablet_manager.cc:1434] T 9831c0f3d91045398ce4c2183f7db85d P f1d79b00d76349faa9c59b372f7877ba: Time spent starting tablet: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:10:39.810263 24991 raft_consensus.cc:493] T b47a8b191555448b9dd7ec692a253b73 P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 2547ecb5f0e74137add24df5b3fbac48)
I20251212 21:10:39.812904 24991 raft_consensus.cc:515] T b47a8b191555448b9dd7ec692a253b73 P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:39.813334 24991 leader_election.cc:290] T b47a8b191555448b9dd7ec692a253b73 P f1d79b00d76349faa9c59b372f7877ba [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 0ecacd125c104184b71910c5597cef64 (127.23.110.130:35585), 2547ecb5f0e74137add24df5b3fbac48 (127.23.110.131:35821)
I20251212 21:10:39.847792 24301 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "b47a8b191555448b9dd7ec692a253b73" candidate_uuid: "f1d79b00d76349faa9c59b372f7877ba" candidate_term: 2 candidate_status { last_received { term: 1 index: 1667 } } ignore_live_leader: false dest_uuid: "0ecacd125c104184b71910c5597cef64" is_pre_election: true
I20251212 21:10:39.918722 24437 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "b47a8b191555448b9dd7ec692a253b73" candidate_uuid: "f1d79b00d76349faa9c59b372f7877ba" candidate_term: 2 candidate_status { last_received { term: 1 index: 1667 } } ignore_live_leader: false dest_uuid: "2547ecb5f0e74137add24df5b3fbac48" is_pre_election: true
I20251212 21:10:39.921900 24869 leader_election.cc:304] T b47a8b191555448b9dd7ec692a253b73 P f1d79b00d76349faa9c59b372f7877ba [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: f1d79b00d76349faa9c59b372f7877ba; no voters: 0ecacd125c104184b71910c5597cef64, 2547ecb5f0e74137add24df5b3fbac48
I20251212 21:10:40.032550 25017 raft_consensus.cc:493] T 0ea0cdb48a6640879d462b9e52dcbcbd P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 2547ecb5f0e74137add24df5b3fbac48)
I20251212 21:10:40.032999 25017 raft_consensus.cc:515] T 0ea0cdb48a6640879d462b9e52dcbcbd P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:40.041553 25017 leader_election.cc:290] T 0ea0cdb48a6640879d462b9e52dcbcbd P f1d79b00d76349faa9c59b372f7877ba [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 0ecacd125c104184b71910c5597cef64 (127.23.110.130:35585), 2547ecb5f0e74137add24df5b3fbac48 (127.23.110.131:35821)
I20251212 21:10:40.041520 24304 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "0ea0cdb48a6640879d462b9e52dcbcbd" candidate_uuid: "f1d79b00d76349faa9c59b372f7877ba" candidate_term: 2 candidate_status { last_received { term: 1 index: 3207 } } ignore_live_leader: false dest_uuid: "0ecacd125c104184b71910c5597cef64" is_pre_election: true
I20251212 21:10:40.041899 24437 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "0ea0cdb48a6640879d462b9e52dcbcbd" candidate_uuid: "f1d79b00d76349faa9c59b372f7877ba" candidate_term: 2 candidate_status { last_received { term: 1 index: 3207 } } ignore_live_leader: false dest_uuid: "2547ecb5f0e74137add24df5b3fbac48" is_pre_election: true
I20251212 21:10:40.042161 24869 leader_election.cc:304] T 0ea0cdb48a6640879d462b9e52dcbcbd P f1d79b00d76349faa9c59b372f7877ba [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: f1d79b00d76349faa9c59b372f7877ba; no voters: 0ecacd125c104184b71910c5597cef64, 2547ecb5f0e74137add24df5b3fbac48
I20251212 21:10:40.042326 25017 raft_consensus.cc:2749] T 0ea0cdb48a6640879d462b9e52dcbcbd P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20251212 21:10:40.121953 24991 raft_consensus.cc:2749] T b47a8b191555448b9dd7ec692a253b73 P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20251212 21:10:40.155742 25016 raft_consensus.cc:493] T 9831c0f3d91045398ce4c2183f7db85d P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 2547ecb5f0e74137add24df5b3fbac48)
I20251212 21:10:40.155848 25016 raft_consensus.cc:515] T 9831c0f3d91045398ce4c2183f7db85d P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } }
I20251212 21:10:40.156112 25016 leader_election.cc:290] T 9831c0f3d91045398ce4c2183f7db85d P f1d79b00d76349faa9c59b372f7877ba [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 0ecacd125c104184b71910c5597cef64 (127.23.110.130:35585), 2547ecb5f0e74137add24df5b3fbac48 (127.23.110.131:35821)
I20251212 21:10:40.156622 24437 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "9831c0f3d91045398ce4c2183f7db85d" candidate_uuid: "f1d79b00d76349faa9c59b372f7877ba" candidate_term: 2 candidate_status { last_received { term: 1 index: 3227 } } ignore_live_leader: false dest_uuid: "2547ecb5f0e74137add24df5b3fbac48" is_pre_election: true
I20251212 21:10:40.156638 24300 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "9831c0f3d91045398ce4c2183f7db85d" candidate_uuid: "f1d79b00d76349faa9c59b372f7877ba" candidate_term: 2 candidate_status { last_received { term: 1 index: 3227 } } ignore_live_leader: false dest_uuid: "0ecacd125c104184b71910c5597cef64" is_pre_election: true
I20251212 21:10:40.157035 24869 leader_election.cc:304] T 9831c0f3d91045398ce4c2183f7db85d P f1d79b00d76349faa9c59b372f7877ba [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: f1d79b00d76349faa9c59b372f7877ba; no voters: 0ecacd125c104184b71910c5597cef64, 2547ecb5f0e74137add24df5b3fbac48
I20251212 21:10:40.157205 25016 raft_consensus.cc:2749] T 9831c0f3d91045398ce4c2183f7db85d P f1d79b00d76349faa9c59b372f7877ba [term 1 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20251212 21:10:40.197889 24784 ts_manager.cc:284] Unset tserver state for f1d79b00d76349faa9c59b372f7877ba from MAINTENANCE_MODE
I20251212 21:10:40.199872 24981 heartbeater.cc:507] Master 127.23.110.190:46669 requested a full tablet report, sending...
I20251212 21:10:40.595943 24726 heartbeater.cc:507] Master 127.23.110.190:46669 requested a full tablet report, sending...
I20251212 21:10:40.912165 24483 heartbeater.cc:507] Master 127.23.110.190:46669 requested a full tablet report, sending...
I20251212 21:10:40.996543 24352 heartbeater.cc:507] Master 127.23.110.190:46669 requested a full tablet report, sending...
I20251212 21:10:43.211095 24776 ts_manager.cc:295] Set tserver state for f1d79b00d76349faa9c59b372f7877ba to MAINTENANCE_MODE
I20251212 21:10:43.211513 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 24851
W20251212 21:10:43.228639 24373 connection.cc:537] client connection to 127.23.110.129:41523 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20251212 21:10:43.228720 24373 proxy.cc:239] Call had error, refreshing address and retrying: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107) [suppressed 85 similar messages]
W20251212 21:10:43.229709 24373 consensus_peers.cc:597] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251212 21:10:43.229768 24373 consensus_peers.cc:597] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251212 21:10:43.229795 24373 consensus_peers.cc:597] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251212 21:10:43.229815 24373 consensus_peers.cc:597] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251212 21:10:43.229831 24373 consensus_peers.cc:597] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251212 21:10:43.229851 24373 consensus_peers.cc:597] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251212 21:10:43.702673 24373 consensus_peers.cc:597] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251212 21:10:43.724215 24373 consensus_peers.cc:597] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251212 21:10:43.725832 24373 consensus_peers.cc:597] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251212 21:10:43.748142 24373 consensus_peers.cc:597] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251212 21:10:43.765888 24373 consensus_peers.cc:597] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251212 21:10:43.786471 24373 consensus_peers.cc:597] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
W20251212 21:10:44.189352 24373 consensus_peers.cc:597] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251212 21:10:44.221613 24373 consensus_peers.cc:597] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251212 21:10:44.271086 24373 consensus_peers.cc:597] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251212 21:10:44.277329 24373 consensus_peers.cc:597] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251212 21:10:44.292201 24373 consensus_peers.cc:597] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251212 21:10:44.313041 24373 consensus_peers.cc:597] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 11: this message will repeat every 5th retry.
W20251212 21:10:44.673976 24373 consensus_peers.cc:597] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251212 21:10:44.718452 24373 consensus_peers.cc:597] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251212 21:10:44.726845 24373 consensus_peers.cc:597] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251212 21:10:44.765334 24373 consensus_peers.cc:597] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251212 21:10:44.808051 24373 consensus_peers.cc:597] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251212 21:10:44.819715 24373 consensus_peers.cc:597] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 16: this message will repeat every 5th retry.
W20251212 21:10:45.153008 24373 consensus_peers.cc:597] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20251212 21:10:45.238291 25028 consensus_queue.cc:579] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Leader has been unable to successfully communicate with peer f1d79b00d76349faa9c59b372f7877ba for more than 2 seconds (2.029s)
I20251212 21:10:45.243080 24988 consensus_queue.cc:579] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Leader has been unable to successfully communicate with peer f1d79b00d76349faa9c59b372f7877ba for more than 2 seconds (2.035s)
I20251212 21:10:45.248389 25024 consensus_queue.cc:579] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Leader has been unable to successfully communicate with peer f1d79b00d76349faa9c59b372f7877ba for more than 2 seconds (2.040s)
W20251212 21:10:45.249958 24373 consensus_peers.cc:597] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20251212 21:10:45.255525 24373 consensus_peers.cc:597] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20251212 21:10:45.256975 25028 consensus_queue.cc:579] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Leader has been unable to successfully communicate with peer f1d79b00d76349faa9c59b372f7877ba for more than 2 seconds (2.049s)
I20251212 21:10:45.271940 25028 consensus_queue.cc:579] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Leader has been unable to successfully communicate with peer f1d79b00d76349faa9c59b372f7877ba for more than 2 seconds (2.064s)
W20251212 21:10:45.280961 24373 consensus_peers.cc:597] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20251212 21:10:45.329672 24746 consensus_queue.cc:579] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Leader has been unable to successfully communicate with peer f1d79b00d76349faa9c59b372f7877ba for more than 2 seconds (2.121s)
W20251212 21:10:45.336596 24373 consensus_peers.cc:597] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
W20251212 21:10:45.336843 24373 consensus_peers.cc:597] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 21: this message will repeat every 5th retry.
I20251212 21:10:45.523536 24776 ts_manager.cc:284] Unset tserver state for f1d79b00d76349faa9c59b372f7877ba from MAINTENANCE_MODE
I20251212 21:10:45.600258 24726 heartbeater.cc:507] Master 127.23.110.190:46669 requested a full tablet report, sending...
W20251212 21:10:45.681602 24373 consensus_peers.cc:597] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251212 21:10:45.779551 24373 consensus_peers.cc:597] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251212 21:10:45.790108 24373 consensus_peers.cc:597] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251212 21:10:45.813956 24373 consensus_peers.cc:597] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251212 21:10:45.819717 24373 consensus_peers.cc:597] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
W20251212 21:10:45.865391 24373 consensus_peers.cc:597] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 26: this message will repeat every 5th retry.
I20251212 21:10:46.009799 24352 heartbeater.cc:507] Master 127.23.110.190:46669 requested a full tablet report, sending...
W20251212 21:10:46.179258 24373 consensus_peers.cc:597] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20251212 21:10:46.276482 24373 consensus_peers.cc:597] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20251212 21:10:46.284494 24373 consensus_peers.cc:597] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
W20251212 21:10:46.312495 24373 consensus_peers.cc:597] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 31: this message will repeat every 5th retry.
I20251212 21:10:46.334584 24483 heartbeater.cc:507] Master 127.23.110.190:46669 requested a full tablet report, sending...
I20251212 21:10:46.350946 24435 logging.cc:424] LogThrottler TrackedPeer Lag: suppressed but not reported on 4 messages since previous log ~7 seconds ago
I20251212 21:10:46.351109 24436 consensus_queue.cc:237] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 6100, Committed index: 6100, Last appended: 2.6100, Last appended by leader: 98, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 6101 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } } peers { permanent_uuid: "d4ac390a2b804f15815d2ba3f4919d61" member_type: NON_VOTER last_known_addr { host: "127.23.110.132" port: 40853 } attrs { promote: true } }
I20251212 21:10:46.351172 24435 consensus_queue.cc:237] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 6095, Committed index: 6095, Last appended: 1.6096, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 6097 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } } peers { permanent_uuid: "d4ac390a2b804f15815d2ba3f4919d61" member_type: NON_VOTER last_known_addr { host: "127.23.110.132" port: 40853 } attrs { promote: true } }
I20251212 21:10:46.351548 24434 logging.cc:424] LogThrottler TrackedPeer Lag: suppressed but not reported on 2 messages since previous log ~6 seconds ago
I20251212 21:10:46.351742 24434 consensus_queue.cc:237] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 6099, Committed index: 6099, Last appended: 1.6099, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 6100 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } } peers { permanent_uuid: "d4ac390a2b804f15815d2ba3f4919d61" member_type: NON_VOTER last_known_addr { host: "127.23.110.132" port: 40853 } attrs { promote: true } }
I20251212 21:10:46.352011 24433 logging.cc:424] LogThrottler TrackedPeer Lag: suppressed but not reported on 2 messages since previous log ~7 seconds ago
I20251212 21:10:46.352180 24433 consensus_queue.cc:237] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 6099, Committed index: 6099, Last appended: 1.6099, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 6100 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } } peers { permanent_uuid: "d4ac390a2b804f15815d2ba3f4919d61" member_type: NON_VOTER last_known_addr { host: "127.23.110.132" port: 40853 } attrs { promote: true } }
I20251212 21:10:46.353057 24300 raft_consensus.cc:1275] T 9831c0f3d91045398ce4c2183f7db85d P 0ecacd125c104184b71910c5597cef64 [term 1 FOLLOWER]: Refusing update from remote peer 2547ecb5f0e74137add24df5b3fbac48: Log matching property violated. Preceding OpId in replica: term: 1 index: 6096. Preceding OpId from leader: term: 1 index: 6097. (index mismatch)
I20251212 21:10:46.353214 24300 raft_consensus.cc:1275] T b47a8b191555448b9dd7ec692a253b73 P 0ecacd125c104184b71910c5597cef64 [term 1 FOLLOWER]: Refusing update from remote peer 2547ecb5f0e74137add24df5b3fbac48: Log matching property violated. Preceding OpId in replica: term: 1 index: 6099. Preceding OpId from leader: term: 1 index: 6100. (index mismatch)
I20251212 21:10:46.353371 24300 raft_consensus.cc:1275] T 0ea0cdb48a6640879d462b9e52dcbcbd P 0ecacd125c104184b71910c5597cef64 [term 1 FOLLOWER]: Refusing update from remote peer 2547ecb5f0e74137add24df5b3fbac48: Log matching property violated. Preceding OpId in replica: term: 1 index: 6099. Preceding OpId from leader: term: 1 index: 6100. (index mismatch)
W20251212 21:10:46.353482 24373 consensus_peers.cc:597] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251212 21:10:46.353539 24373 consensus_peers.cc:597] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
W20251212 21:10:46.353575 24373 consensus_peers.cc:597] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20251212 21:10:46.353631 24746 consensus_queue.cc:1048] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Connected to new peer: Peer: permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 6097, Last known committed idx: 6095, Time since last communication: 0.000s
I20251212 21:10:46.353668 24747 consensus_queue.cc:1048] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Connected to new peer: Peer: permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 6100, Last known committed idx: 6099, Time since last communication: 0.000s
I20251212 21:10:46.353740 24746 consensus_queue.cc:1048] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Connected to new peer: Peer: permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 6100, Last known committed idx: 6099, Time since last communication: 0.000s
I20251212 21:10:46.354614 25026 raft_consensus.cc:2955] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 [term 1 LEADER]: Committing config change with OpId 1.6100: config changed from index -1 to 6100, NON_VOTER d4ac390a2b804f15815d2ba3f4919d61 (127.23.110.132) added. New config: { opid_index: 6100 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } } peers { permanent_uuid: "d4ac390a2b804f15815d2ba3f4919d61" member_type: NON_VOTER last_known_addr { host: "127.23.110.132" port: 40853 } attrs { promote: true } } }
I20251212 21:10:46.355161 24437 logging.cc:424] LogThrottler TrackedPeer Lag: suppressed but not reported on 2 messages since previous log ~7 seconds ago
I20251212 21:10:46.355321 24437 consensus_queue.cc:237] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 6099, Committed index: 6099, Last appended: 1.6102, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 6103 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } } peers { permanent_uuid: "d4ac390a2b804f15815d2ba3f4919d61" member_type: NON_VOTER last_known_addr { host: "127.23.110.132" port: 40853 } attrs { promote: true } }
I20251212 21:10:46.355971 24772 catalog_manager.cc:5167] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 0ea0cdb48a6640879d462b9e52dcbcbd with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20251212 21:10:46.355939 24776 catalog_manager.cc:5654] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 reported cstate change: config changed from index -1 to 6100, NON_VOTER d4ac390a2b804f15815d2ba3f4919d61 (127.23.110.132) added. New cstate: current_term: 1 leader_uuid: "2547ecb5f0e74137add24df5b3fbac48" committed_config { opid_index: 6100 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "d4ac390a2b804f15815d2ba3f4919d61" member_type: NON_VOTER last_known_addr { host: "127.23.110.132" port: 40853 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
I20251212 21:10:46.356109 25028 raft_consensus.cc:2955] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 [term 1 LEADER]: Committing config change with OpId 1.6100: config changed from index -1 to 6100, NON_VOTER d4ac390a2b804f15815d2ba3f4919d61 (127.23.110.132) added. New config: { opid_index: 6100 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } } peers { permanent_uuid: "d4ac390a2b804f15815d2ba3f4919d61" member_type: NON_VOTER last_known_addr { host: "127.23.110.132" port: 40853 } attrs { promote: true } } }
W20251212 21:10:46.356261 24373 consensus_peers.cc:597] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20251212 21:10:46.356731 24303 raft_consensus.cc:2955] T 0ea0cdb48a6640879d462b9e52dcbcbd P 0ecacd125c104184b71910c5597cef64 [term 1 FOLLOWER]: Committing config change with OpId 1.6100: config changed from index -1 to 6100, NON_VOTER d4ac390a2b804f15815d2ba3f4919d61 (127.23.110.132) added. New config: { opid_index: 6100 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } } peers { permanent_uuid: "d4ac390a2b804f15815d2ba3f4919d61" member_type: NON_VOTER last_known_addr { host: "127.23.110.132" port: 40853 } attrs { promote: true } } }
I20251212 21:10:46.356112 24988 raft_consensus.cc:2955] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 [term 1 LEADER]: Committing config change with OpId 1.6097: config changed from index -1 to 6097, NON_VOTER d4ac390a2b804f15815d2ba3f4919d61 (127.23.110.132) added. New config: { opid_index: 6097 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } } peers { permanent_uuid: "d4ac390a2b804f15815d2ba3f4919d61" member_type: NON_VOTER last_known_addr { host: "127.23.110.132" port: 40853 } attrs { promote: true } } }
I20251212 21:10:46.357467 24432 logging.cc:424] LogThrottler TrackedPeer Lag: suppressed but not reported on 1 messages since previous log ~7 seconds ago
I20251212 21:10:46.357671 24432 consensus_queue.cc:237] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 6099, Committed index: 6099, Last appended: 1.6099, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 6100 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "d4ac390a2b804f15815d2ba3f4919d61" member_type: NON_VOTER last_known_addr { host: "127.23.110.132" port: 40853 } attrs { promote: true } }
I20251212 21:10:46.358139 24306 raft_consensus.cc:1275] T d0f2580352f640e794cf95624f9b32d0 P 0ecacd125c104184b71910c5597cef64 [term 1 FOLLOWER]: Refusing update from remote peer 2547ecb5f0e74137add24df5b3fbac48: Log matching property violated. Preceding OpId in replica: term: 1 index: 6100. Preceding OpId from leader: term: 1 index: 6103. (index mismatch)
I20251212 21:10:46.358410 24776 catalog_manager.cc:5654] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 reported cstate change: config changed from index -1 to 6100, NON_VOTER d4ac390a2b804f15815d2ba3f4919d61 (127.23.110.132) added. New cstate: current_term: 1 leader_uuid: "2547ecb5f0e74137add24df5b3fbac48" committed_config { opid_index: 6100 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "d4ac390a2b804f15815d2ba3f4919d61" member_type: NON_VOTER last_known_addr { host: "127.23.110.132" port: 40853 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
I20251212 21:10:46.358459 24303 raft_consensus.cc:2955] T 9831c0f3d91045398ce4c2183f7db85d P 0ecacd125c104184b71910c5597cef64 [term 1 FOLLOWER]: Committing config change with OpId 1.6097: config changed from index -1 to 6097, NON_VOTER d4ac390a2b804f15815d2ba3f4919d61 (127.23.110.132) added. New config: { opid_index: 6097 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } } peers { permanent_uuid: "d4ac390a2b804f15815d2ba3f4919d61" member_type: NON_VOTER last_known_addr { host: "127.23.110.132" port: 40853 } attrs { promote: true } } }
W20251212 21:10:46.359193 24373 consensus_peers.cc:597] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20251212 21:10:46.359483 24772 catalog_manager.cc:5167] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet b47a8b191555448b9dd7ec692a253b73 with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20251212 21:10:46.359571 24772 catalog_manager.cc:5167] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 9831c0f3d91045398ce4c2183f7db85d with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20251212 21:10:46.359858 24303 raft_consensus.cc:2955] T b47a8b191555448b9dd7ec692a253b73 P 0ecacd125c104184b71910c5597cef64 [term 1 FOLLOWER]: Committing config change with OpId 1.6100: config changed from index -1 to 6100, NON_VOTER d4ac390a2b804f15815d2ba3f4919d61 (127.23.110.132) added. New config: { opid_index: 6100 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } } peers { permanent_uuid: "d4ac390a2b804f15815d2ba3f4919d61" member_type: NON_VOTER last_known_addr { host: "127.23.110.132" port: 40853 } attrs { promote: true } } }
I20251212 21:10:46.360069 24776 catalog_manager.cc:5654] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 reported cstate change: config changed from index -1 to 6097, NON_VOTER d4ac390a2b804f15815d2ba3f4919d61 (127.23.110.132) added. New cstate: current_term: 1 leader_uuid: "2547ecb5f0e74137add24df5b3fbac48" committed_config { opid_index: 6097 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "d4ac390a2b804f15815d2ba3f4919d61" member_type: NON_VOTER last_known_addr { host: "127.23.110.132" port: 40853 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
W20251212 21:10:46.360661 24373 consensus_peers.cc:597] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer f1d79b00d76349faa9c59b372f7877ba (127.23.110.129:41523): Couldn't send request to peer f1d79b00d76349faa9c59b372f7877ba. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.129:41523: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20251212 21:10:46.360946 24306 raft_consensus.cc:1275] T a3ecd74d94aa4867aab9e37f1675cc27 P 0ecacd125c104184b71910c5597cef64 [term 1 FOLLOWER]: Refusing update from remote peer 2547ecb5f0e74137add24df5b3fbac48: Log matching property violated. Preceding OpId in replica: term: 1 index: 6099. Preceding OpId from leader: term: 1 index: 6100. (index mismatch)
I20251212 21:10:46.361029 24300 raft_consensus.cc:1275] T 074cc25aad644982ae7d025718ae01f5 P 0ecacd125c104184b71910c5597cef64 [term 2 FOLLOWER]: Refusing update from remote peer 2547ecb5f0e74137add24df5b3fbac48: Log matching property violated. Preceding OpId in replica: term: 2 index: 6100. Preceding OpId from leader: term: 2 index: 6101. (index mismatch)
I20251212 21:10:46.361277 25047 consensus_queue.cc:1048] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Connected to new peer: Peer: permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 6103, Last known committed idx: 6099, Time since last communication: 0.000s
I20251212 21:10:46.361441 25024 consensus_queue.cc:1048] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Connected to new peer: Peer: permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 6100, Last known committed idx: 6099, Time since last communication: 0.000s
I20251212 21:10:46.361490 24987 consensus_queue.cc:1048] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [LEADER]: Connected to new peer: Peer: permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 6101, Last known committed idx: 6100, Time since last communication: 0.000s
I20251212 21:10:46.363057 25028 raft_consensus.cc:2955] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 [term 1 LEADER]: Committing config change with OpId 1.6103: config changed from index -1 to 6103, NON_VOTER d4ac390a2b804f15815d2ba3f4919d61 (127.23.110.132) added. New config: { opid_index: 6103 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } } peers { permanent_uuid: "d4ac390a2b804f15815d2ba3f4919d61" member_type: NON_VOTER last_known_addr { host: "127.23.110.132" port: 40853 } attrs { promote: true } } }
I20251212 21:10:46.363684 24300 raft_consensus.cc:2955] T d0f2580352f640e794cf95624f9b32d0 P 0ecacd125c104184b71910c5597cef64 [term 1 FOLLOWER]: Committing config change with OpId 1.6103: config changed from index -1 to 6103, NON_VOTER d4ac390a2b804f15815d2ba3f4919d61 (127.23.110.132) added. New config: { opid_index: 6103 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } } peers { permanent_uuid: "d4ac390a2b804f15815d2ba3f4919d61" member_type: NON_VOTER last_known_addr { host: "127.23.110.132" port: 40853 } attrs { promote: true } } }
I20251212 21:10:46.364225 24776 catalog_manager.cc:5654] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 reported cstate change: config changed from index -1 to 6103, NON_VOTER d4ac390a2b804f15815d2ba3f4919d61 (127.23.110.132) added. New cstate: current_term: 1 leader_uuid: "2547ecb5f0e74137add24df5b3fbac48" committed_config { opid_index: 6103 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "d4ac390a2b804f15815d2ba3f4919d61" member_type: NON_VOTER last_known_addr { host: "127.23.110.132" port: 40853 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
I20251212 21:10:46.364832 24772 catalog_manager.cc:5167] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet d0f2580352f640e794cf95624f9b32d0 with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20251212 21:10:46.365095 25047 raft_consensus.cc:2955] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 [term 2 LEADER]: Committing config change with OpId 2.6101: config changed from index -1 to 6101, NON_VOTER d4ac390a2b804f15815d2ba3f4919d61 (127.23.110.132) added. New config: { opid_index: 6101 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } } peers { permanent_uuid: "d4ac390a2b804f15815d2ba3f4919d61" member_type: NON_VOTER last_known_addr { host: "127.23.110.132" port: 40853 } attrs { promote: true } } }
I20251212 21:10:46.366050 24988 raft_consensus.cc:2955] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 [term 1 LEADER]: Committing config change with OpId 1.6100: config changed from index -1 to 6100, NON_VOTER d4ac390a2b804f15815d2ba3f4919d61 (127.23.110.132) added. New config: { opid_index: 6100 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "d4ac390a2b804f15815d2ba3f4919d61" member_type: NON_VOTER last_known_addr { host: "127.23.110.132" port: 40853 } attrs { promote: true } } }
I20251212 21:10:46.366321 24300 raft_consensus.cc:2955] T a3ecd74d94aa4867aab9e37f1675cc27 P 0ecacd125c104184b71910c5597cef64 [term 1 FOLLOWER]: Committing config change with OpId 1.6100: config changed from index -1 to 6100, NON_VOTER d4ac390a2b804f15815d2ba3f4919d61 (127.23.110.132) added. New config: { opid_index: 6100 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "d4ac390a2b804f15815d2ba3f4919d61" member_type: NON_VOTER last_known_addr { host: "127.23.110.132" port: 40853 } attrs { promote: true } } }
I20251212 21:10:46.366750 24776 catalog_manager.cc:5654] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 reported cstate change: config changed from index -1 to 6101, NON_VOTER d4ac390a2b804f15815d2ba3f4919d61 (127.23.110.132) added. New cstate: current_term: 2 leader_uuid: "2547ecb5f0e74137add24df5b3fbac48" committed_config { opid_index: 6101 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "d4ac390a2b804f15815d2ba3f4919d61" member_type: NON_VOTER last_known_addr { host: "127.23.110.132" port: 40853 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
W20251212 21:10:46.366990 24373 consensus_peers.cc:597] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer d4ac390a2b804f15815d2ba3f4919d61 (127.23.110.132:40853): Couldn't send request to peer d4ac390a2b804f15815d2ba3f4919d61. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: b47a8b191555448b9dd7ec692a253b73. This is attempt 1: this message will repeat every 5th retry.
W20251212 21:10:46.367043 24373 consensus_peers.cc:597] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 -> Peer d4ac390a2b804f15815d2ba3f4919d61 (127.23.110.132:40853): Couldn't send request to peer d4ac390a2b804f15815d2ba3f4919d61. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 0ea0cdb48a6640879d462b9e52dcbcbd. This is attempt 1: this message will repeat every 5th retry.
W20251212 21:10:46.367069 24373 consensus_peers.cc:597] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer d4ac390a2b804f15815d2ba3f4919d61 (127.23.110.132:40853): Couldn't send request to peer d4ac390a2b804f15815d2ba3f4919d61. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: d0f2580352f640e794cf95624f9b32d0. This is attempt 1: this message will repeat every 5th retry.
W20251212 21:10:46.367093 24373 consensus_peers.cc:597] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer d4ac390a2b804f15815d2ba3f4919d61 (127.23.110.132:40853): Couldn't send request to peer d4ac390a2b804f15815d2ba3f4919d61. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: a3ecd74d94aa4867aab9e37f1675cc27. This is attempt 1: this message will repeat every 5th retry.
W20251212 21:10:46.367116 24373 consensus_peers.cc:597] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer d4ac390a2b804f15815d2ba3f4919d61 (127.23.110.132:40853): Couldn't send request to peer d4ac390a2b804f15815d2ba3f4919d61. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 074cc25aad644982ae7d025718ae01f5. This is attempt 1: this message will repeat every 5th retry.
W20251212 21:10:46.367142 24373 consensus_peers.cc:597] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 -> Peer d4ac390a2b804f15815d2ba3f4919d61 (127.23.110.132:40853): Couldn't send request to peer d4ac390a2b804f15815d2ba3f4919d61. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 9831c0f3d91045398ce4c2183f7db85d. This is attempt 1: this message will repeat every 5th retry.
I20251212 21:10:46.367336 24772 catalog_manager.cc:5167] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 074cc25aad644982ae7d025718ae01f5 with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20251212 21:10:46.367405 24772 catalog_manager.cc:5167] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet a3ecd74d94aa4867aab9e37f1675cc27 with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20251212 21:10:46.369316 24784 catalog_manager.cc:5654] T a3ecd74d94aa4867aab9e37f1675cc27 P 0ecacd125c104184b71910c5597cef64 reported cstate change: config changed from index -1 to 6100, NON_VOTER d4ac390a2b804f15815d2ba3f4919d61 (127.23.110.132) added. New cstate: current_term: 1 leader_uuid: "2547ecb5f0e74137add24df5b3fbac48" committed_config { opid_index: 6100 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "d4ac390a2b804f15815d2ba3f4919d61" member_type: NON_VOTER last_known_addr { host: "127.23.110.132" port: 40853 } attrs { promote: true } } }
I20251212 21:10:46.370539 24303 raft_consensus.cc:2955] T 074cc25aad644982ae7d025718ae01f5 P 0ecacd125c104184b71910c5597cef64 [term 2 FOLLOWER]: Committing config change with OpId 2.6101: config changed from index -1 to 6101, NON_VOTER d4ac390a2b804f15815d2ba3f4919d61 (127.23.110.132) added. New config: { opid_index: 6101 OBSOLETE_local: false peers { permanent_uuid: "f1d79b00d76349faa9c59b372f7877ba" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 41523 } } peers { permanent_uuid: "0ecacd125c104184b71910c5597cef64" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 35585 } } peers { permanent_uuid: "2547ecb5f0e74137add24df5b3fbac48" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 35821 } } peers { permanent_uuid: "d4ac390a2b804f15815d2ba3f4919d61" member_type: NON_VOTER last_known_addr { host: "127.23.110.132" port: 40853 } attrs { promote: true } } }
I20251212 21:10:46.433917 25061 ts_tablet_manager.cc:933] T 0ea0cdb48a6640879d462b9e52dcbcbd P d4ac390a2b804f15815d2ba3f4919d61: Initiating tablet copy from peer 2547ecb5f0e74137add24df5b3fbac48 (127.23.110.131:35821)
I20251212 21:10:46.435110 25061 tablet_copy_client.cc:323] T 0ea0cdb48a6640879d462b9e52dcbcbd P d4ac390a2b804f15815d2ba3f4919d61: tablet copy: Beginning tablet copy session from remote peer at address 127.23.110.131:35821
I20251212 21:10:46.439878 24457 tablet_copy_service.cc:140] P 2547ecb5f0e74137add24df5b3fbac48: Received BeginTabletCopySession request for tablet 0ea0cdb48a6640879d462b9e52dcbcbd from peer d4ac390a2b804f15815d2ba3f4919d61 ({username='slave'} at 127.23.110.132:56237)
I20251212 21:10:46.439966 24457 tablet_copy_service.cc:161] P 2547ecb5f0e74137add24df5b3fbac48: Beginning new tablet copy session on tablet 0ea0cdb48a6640879d462b9e52dcbcbd from peer d4ac390a2b804f15815d2ba3f4919d61 at {username='slave'} at 127.23.110.132:56237: session id = d4ac390a2b804f15815d2ba3f4919d61-0ea0cdb48a6640879d462b9e52dcbcbd
I20251212 21:10:46.440625 24457 tablet_copy_source_session.cc:215] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48: Tablet Copy: opened 0 blocks and 1 log segments
I20251212 21:10:46.442363 25061 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 0ea0cdb48a6640879d462b9e52dcbcbd. 1 dirs total, 0 dirs full, 0 dirs failed
I20251212 21:10:46.445086 25061 tablet_copy_client.cc:806] T 0ea0cdb48a6640879d462b9e52dcbcbd P d4ac390a2b804f15815d2ba3f4919d61: tablet copy: Starting download of 0 data blocks...
I20251212 21:10:46.446424 25061 tablet_copy_client.cc:670] T 0ea0cdb48a6640879d462b9e52dcbcbd P d4ac390a2b804f15815d2ba3f4919d61: tablet copy: Starting download of 1 WAL segments...
I20251212 21:10:46.445518 25064 ts_tablet_manager.cc:933] T d0f2580352f640e794cf95624f9b32d0 P d4ac390a2b804f15815d2ba3f4919d61: Initiating tablet copy from peer 2547ecb5f0e74137add24df5b3fbac48 (127.23.110.131:35821)
I20251212 21:10:46.448668 25064 tablet_copy_client.cc:323] T d0f2580352f640e794cf95624f9b32d0 P d4ac390a2b804f15815d2ba3f4919d61: tablet copy: Beginning tablet copy session from remote peer at address 127.23.110.131:35821
I20251212 21:10:46.449976 24456 tablet_copy_service.cc:140] P 2547ecb5f0e74137add24df5b3fbac48: Received BeginTabletCopySession request for tablet d0f2580352f640e794cf95624f9b32d0 from peer d4ac390a2b804f15815d2ba3f4919d61 ({username='slave'} at 127.23.110.132:56237)
I20251212 21:10:46.450034 24456 tablet_copy_service.cc:161] P 2547ecb5f0e74137add24df5b3fbac48: Beginning new tablet copy session on tablet d0f2580352f640e794cf95624f9b32d0 from peer d4ac390a2b804f15815d2ba3f4919d61 at {username='slave'} at 127.23.110.132:56237: session id = d4ac390a2b804f15815d2ba3f4919d61-d0f2580352f640e794cf95624f9b32d0
I20251212 21:10:46.450604 24456 tablet_copy_source_session.cc:215] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48: Tablet Copy: opened 0 blocks and 1 log segments
I20251212 21:10:46.452353 25066 ts_tablet_manager.cc:933] T b47a8b191555448b9dd7ec692a253b73 P d4ac390a2b804f15815d2ba3f4919d61: Initiating tablet copy from peer 2547ecb5f0e74137add24df5b3fbac48 (127.23.110.131:35821)
I20251212 21:10:46.453214 25066 tablet_copy_client.cc:323] T b47a8b191555448b9dd7ec692a253b73 P d4ac390a2b804f15815d2ba3f4919d61: tablet copy: Beginning tablet copy session from remote peer at address 127.23.110.131:35821
I20251212 21:10:46.453220 25064 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet d0f2580352f640e794cf95624f9b32d0. 1 dirs total, 0 dirs full, 0 dirs failed
I20251212 21:10:46.456055 25064 tablet_copy_client.cc:806] T d0f2580352f640e794cf95624f9b32d0 P d4ac390a2b804f15815d2ba3f4919d61: tablet copy: Starting download of 0 data blocks...
I20251212 21:10:46.456173 25064 tablet_copy_client.cc:670] T d0f2580352f640e794cf95624f9b32d0 P d4ac390a2b804f15815d2ba3f4919d61: tablet copy: Starting download of 1 WAL segments...
I20251212 21:10:46.456557 24456 tablet_copy_service.cc:140] P 2547ecb5f0e74137add24df5b3fbac48: Received BeginTabletCopySession request for tablet b47a8b191555448b9dd7ec692a253b73 from peer d4ac390a2b804f15815d2ba3f4919d61 ({username='slave'} at 127.23.110.132:56237)
I20251212 21:10:46.456617 24456 tablet_copy_service.cc:161] P 2547ecb5f0e74137add24df5b3fbac48: Beginning new tablet copy session on tablet b47a8b191555448b9dd7ec692a253b73 from peer d4ac390a2b804f15815d2ba3f4919d61 at {username='slave'} at 127.23.110.132:56237: session id = d4ac390a2b804f15815d2ba3f4919d61-b47a8b191555448b9dd7ec692a253b73
I20251212 21:10:46.457158 24456 tablet_copy_source_session.cc:215] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48: Tablet Copy: opened 0 blocks and 1 log segments
I20251212 21:10:46.468912 25061 tablet_copy_client.cc:538] T 0ea0cdb48a6640879d462b9e52dcbcbd P d4ac390a2b804f15815d2ba3f4919d61: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20251212 21:10:46.470116 25061 tablet_bootstrap.cc:492] T 0ea0cdb48a6640879d462b9e52dcbcbd P d4ac390a2b804f15815d2ba3f4919d61: Bootstrap starting.
I20251212 21:10:46.474874 25066 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet b47a8b191555448b9dd7ec692a253b73. 1 dirs total, 0 dirs full, 0 dirs failed
I20251212 21:10:46.476788 25066 tablet_copy_client.cc:806] T b47a8b191555448b9dd7ec692a253b73 P d4ac390a2b804f15815d2ba3f4919d61: tablet copy: Starting download of 0 data blocks...
I20251212 21:10:46.476948 25066 tablet_copy_client.cc:670] T b47a8b191555448b9dd7ec692a253b73 P d4ac390a2b804f15815d2ba3f4919d61: tablet copy: Starting download of 1 WAL segments...
I20251212 21:10:46.477054 25070 ts_tablet_manager.cc:933] T 9831c0f3d91045398ce4c2183f7db85d P d4ac390a2b804f15815d2ba3f4919d61: Initiating tablet copy from peer 2547ecb5f0e74137add24df5b3fbac48 (127.23.110.131:35821)
I20251212 21:10:46.477301 25070 tablet_copy_client.cc:323] T 9831c0f3d91045398ce4c2183f7db85d P d4ac390a2b804f15815d2ba3f4919d61: tablet copy: Beginning tablet copy session from remote peer at address 127.23.110.131:35821
I20251212 21:10:46.477517 25069 ts_tablet_manager.cc:933] T 074cc25aad644982ae7d025718ae01f5 P d4ac390a2b804f15815d2ba3f4919d61: Initiating tablet copy from peer 2547ecb5f0e74137add24df5b3fbac48 (127.23.110.131:35821)
I20251212 21:10:46.477521 24456 tablet_copy_service.cc:140] P 2547ecb5f0e74137add24df5b3fbac48: Received BeginTabletCopySession request for tablet 9831c0f3d91045398ce4c2183f7db85d from peer d4ac390a2b804f15815d2ba3f4919d61 ({username='slave'} at 127.23.110.132:56237)
I20251212 21:10:46.477588 24456 tablet_copy_service.cc:161] P 2547ecb5f0e74137add24df5b3fbac48: Beginning new tablet copy session on tablet 9831c0f3d91045398ce4c2183f7db85d from peer d4ac390a2b804f15815d2ba3f4919d61 at {username='slave'} at 127.23.110.132:56237: session id = d4ac390a2b804f15815d2ba3f4919d61-9831c0f3d91045398ce4c2183f7db85d
I20251212 21:10:46.477835 25069 tablet_copy_client.cc:323] T 074cc25aad644982ae7d025718ae01f5 P d4ac390a2b804f15815d2ba3f4919d61: tablet copy: Beginning tablet copy session from remote peer at address 127.23.110.131:35821
I20251212 21:10:46.478019 25068 ts_tablet_manager.cc:933] T a3ecd74d94aa4867aab9e37f1675cc27 P d4ac390a2b804f15815d2ba3f4919d61: Initiating tablet copy from peer 2547ecb5f0e74137add24df5b3fbac48 (127.23.110.131:35821)
I20251212 21:10:46.478153 25068 tablet_copy_client.cc:323] T a3ecd74d94aa4867aab9e37f1675cc27 P d4ac390a2b804f15815d2ba3f4919d61: tablet copy: Beginning tablet copy session from remote peer at address 127.23.110.131:35821
I20251212 21:10:46.478313 24456 tablet_copy_source_session.cc:215] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48: Tablet Copy: opened 0 blocks and 1 log segments
I20251212 21:10:46.478812 24455 tablet_copy_service.cc:140] P 2547ecb5f0e74137add24df5b3fbac48: Received BeginTabletCopySession request for tablet a3ecd74d94aa4867aab9e37f1675cc27 from peer d4ac390a2b804f15815d2ba3f4919d61 ({username='slave'} at 127.23.110.132:56237)
I20251212 21:10:46.478866 24455 tablet_copy_service.cc:161] P 2547ecb5f0e74137add24df5b3fbac48: Beginning new tablet copy session on tablet a3ecd74d94aa4867aab9e37f1675cc27 from peer d4ac390a2b804f15815d2ba3f4919d61 at {username='slave'} at 127.23.110.132:56237: session id = d4ac390a2b804f15815d2ba3f4919d61-a3ecd74d94aa4867aab9e37f1675cc27
I20251212 21:10:46.478861 25070 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 9831c0f3d91045398ce4c2183f7db85d. 1 dirs total, 0 dirs full, 0 dirs failed
I20251212 21:10:46.479341 24455 tablet_copy_source_session.cc:215] T a3ecd74d94aa4867aab9e37f1675cc27 P 2547ecb5f0e74137add24df5b3fbac48: Tablet Copy: opened 0 blocks and 1 log segments
I20251212 21:10:46.480876 25070 tablet_copy_client.cc:806] T 9831c0f3d91045398ce4c2183f7db85d P d4ac390a2b804f15815d2ba3f4919d61: tablet copy: Starting download of 0 data blocks...
I20251212 21:10:46.480999 25070 tablet_copy_client.cc:670] T 9831c0f3d91045398ce4c2183f7db85d P d4ac390a2b804f15815d2ba3f4919d61: tablet copy: Starting download of 1 WAL segments...
I20251212 21:10:46.482052 24456 tablet_copy_service.cc:140] P 2547ecb5f0e74137add24df5b3fbac48: Received BeginTabletCopySession request for tablet 074cc25aad644982ae7d025718ae01f5 from peer d4ac390a2b804f15815d2ba3f4919d61 ({username='slave'} at 127.23.110.132:56237)
I20251212 21:10:46.482118 24456 tablet_copy_service.cc:161] P 2547ecb5f0e74137add24df5b3fbac48: Beginning new tablet copy session on tablet 074cc25aad644982ae7d025718ae01f5 from peer d4ac390a2b804f15815d2ba3f4919d61 at {username='slave'} at 127.23.110.132:56237: session id = d4ac390a2b804f15815d2ba3f4919d61-074cc25aad644982ae7d025718ae01f5
I20251212 21:10:46.482676 24456 tablet_copy_source_session.cc:215] T 074cc25aad644982ae7d025718ae01f5 P 2547ecb5f0e74137add24df5b3fbac48: Tablet Copy: opened 0 blocks and 1 log segments
I20251212 21:10:46.490216 25069 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 074cc25aad644982ae7d025718ae01f5. 1 dirs total, 0 dirs full, 0 dirs failed
I20251212 21:10:46.491791 25069 tablet_copy_client.cc:806] T 074cc25aad644982ae7d025718ae01f5 P d4ac390a2b804f15815d2ba3f4919d61: tablet copy: Starting download of 0 data blocks...
I20251212 21:10:46.491932 25069 tablet_copy_client.cc:670] T 074cc25aad644982ae7d025718ae01f5 P d4ac390a2b804f15815d2ba3f4919d61: tablet copy: Starting download of 1 WAL segments...
I20251212 21:10:46.492899 25068 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet a3ecd74d94aa4867aab9e37f1675cc27. 1 dirs total, 0 dirs full, 0 dirs failed
I20251212 21:10:46.495329 25066 tablet_copy_client.cc:538] T b47a8b191555448b9dd7ec692a253b73 P d4ac390a2b804f15815d2ba3f4919d61: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20251212 21:10:46.501592 25068 tablet_copy_client.cc:806] T a3ecd74d94aa4867aab9e37f1675cc27 P d4ac390a2b804f15815d2ba3f4919d61: tablet copy: Starting download of 0 data blocks...
I20251212 21:10:46.501881 25068 tablet_copy_client.cc:670] T a3ecd74d94aa4867aab9e37f1675cc27 P d4ac390a2b804f15815d2ba3f4919d61: tablet copy: Starting download of 1 WAL segments...
I20251212 21:10:46.504444 25066 tablet_bootstrap.cc:492] T b47a8b191555448b9dd7ec692a253b73 P d4ac390a2b804f15815d2ba3f4919d61: Bootstrap starting.
I20251212 21:10:46.533191 25064 tablet_copy_client.cc:538] T d0f2580352f640e794cf95624f9b32d0 P d4ac390a2b804f15815d2ba3f4919d61: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20251212 21:10:46.534174 25064 tablet_bootstrap.cc:492] T d0f2580352f640e794cf95624f9b32d0 P d4ac390a2b804f15815d2ba3f4919d61: Bootstrap starting.
I20251212 21:10:46.549794 25068 tablet_copy_client.cc:538] T a3ecd74d94aa4867aab9e37f1675cc27 P d4ac390a2b804f15815d2ba3f4919d61: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20251212 21:10:46.550964 25068 tablet_bootstrap.cc:492] T a3ecd74d94aa4867aab9e37f1675cc27 P d4ac390a2b804f15815d2ba3f4919d61: Bootstrap starting.
I20251212 21:10:46.552613 25070 tablet_copy_client.cc:538] T 9831c0f3d91045398ce4c2183f7db85d P d4ac390a2b804f15815d2ba3f4919d61: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20251212 21:10:46.553681 25070 tablet_bootstrap.cc:492] T 9831c0f3d91045398ce4c2183f7db85d P d4ac390a2b804f15815d2ba3f4919d61: Bootstrap starting.
I20251212 21:10:46.562640 25069 tablet_copy_client.cc:538] T 074cc25aad644982ae7d025718ae01f5 P d4ac390a2b804f15815d2ba3f4919d61: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20251212 21:10:46.563623 25069 tablet_bootstrap.cc:492] T 074cc25aad644982ae7d025718ae01f5 P d4ac390a2b804f15815d2ba3f4919d61: Bootstrap starting.
I20251212 21:10:46.665391 25061 log.cc:826] T 0ea0cdb48a6640879d462b9e52dcbcbd P d4ac390a2b804f15815d2ba3f4919d61: Log is configured to *not* fsync() on all Append() calls
I20251212 21:10:46.780416 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 24224
W20251212 21:10:46.796538 24372 connection.cc:537] client connection to 127.23.110.130:35585 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20251212 21:10:46.797905 25038 negotiation.cc:337] Failed RPC negotiation. Trace:
1212 21:10:46.797228 (+     0us) reactor.cc:625] Submitting negotiation task for client connection to 127.23.110.130:35585 (local address 127.23.110.131:35109)
1212 21:10:46.797427 (+   199us) negotiation.cc:107] Waiting for socket to connect
1212 21:10:46.797440 (+    13us) client_negotiation.cc:174] Beginning negotiation
1212 21:10:46.797491 (+    51us) client_negotiation.cc:252] Sending NEGOTIATE NegotiatePB request
1212 21:10:46.797781 (+   290us) negotiation.cc:327] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.23.110.130:35585: BlockingRecv error: recv error from unknown peer: Transport endpoint is not connected (error 107)
Metrics: {"client-negotiator.queue_time_us":179}
W20251212 21:10:46.798216 24372 consensus_peers.cc:597] T 9831c0f3d91045398ce4c2183f7db85d P 2547ecb5f0e74137add24df5b3fbac48 -> Peer 0ecacd125c104184b71910c5597cef64 (127.23.110.130:35585): Couldn't send request to peer 0ecacd125c104184b71910c5597cef64. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.130:35585: BlockingRecv error: recv error from unknown peer: Transport endpoint is not connected (error 107). This is attempt 1: this message will repeat every 5th retry.
W20251212 21:10:46.798271 24372 consensus_peers.cc:597] T b47a8b191555448b9dd7ec692a253b73 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer 0ecacd125c104184b71910c5597cef64 (127.23.110.130:35585): Couldn't send request to peer 0ecacd125c104184b71910c5597cef64. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.130:35585: BlockingRecv error: recv error from unknown peer: Transport endpoint is not connected (error 107). This is attempt 1: this message will repeat every 5th retry.
W20251212 21:10:46.798293 24372 consensus_peers.cc:597] T d0f2580352f640e794cf95624f9b32d0 P 2547ecb5f0e74137add24df5b3fbac48 -> Peer 0ecacd125c104184b71910c5597cef64 (127.23.110.130:35585): Couldn't send request to peer 0ecacd125c104184b71910c5597cef64. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.130:35585: BlockingRecv error: recv error from unknown peer: Transport endpoint is not connected (error 107). This is attempt 1: this message will repeat every 5th retry.
W20251212 21:10:46.798312 24372 consensus_peers.cc:597] T 0ea0cdb48a6640879d462b9e52dcbcbd P 2547ecb5f0e74137add24df5b3fbac48 -> Peer 0ecacd125c104184b71910c5597cef64 (127.23.110.130:35585): Couldn't send request to peer 0ecacd125c104184b71910c5597cef64. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.130:35585: BlockingRecv error: recv error from unknown peer: Transport endpoint is not connected (error 107). This is attempt 1: this message will repeat every 5th retry.
I20251212 21:10:46.799758 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 24355
I20251212 21:10:46.815431 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 24541
I20251212 21:10:46.820813 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 24754
2025-12-12T21:10:46Z chronyd exiting
[       OK ] MaintenanceModeRF3ITest.TestFailedTServerInMaintenanceModeDoesntRereplicate (15222 ms)
[----------] 1 test from MaintenanceModeRF3ITest (15222 ms total)

[----------] 1 test from RollingRestartArgs/RollingRestartITest
[ RUN      ] RollingRestartArgs/RollingRestartITest.TestWorkloads/4
2025-12-12T21:10:46Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2025-12-12T21:10:46Z Disabled control of system clock
I20251212 21:10:46.867900 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.23.110.190:45865
--webserver_interface=127.23.110.190
--webserver_port=0
--builtin_ntp_servers=127.23.110.148:39679
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.23.110.190:45865
--location_mapping_cmd=/tmp/dist-test-taskAaNqbA/build/release/bin/testdata/assign-location.py --state_store=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/location-assignment.state --map /L0:4
--master_client_location_assignment_enabled=false with env {}
W20251212 21:10:46.951023 25090 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:10:46.951220 25090 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:10:46.951244 25090 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:10:46.952770 25090 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20251212 21:10:46.952827 25090 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:10:46.952844 25090 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20251212 21:10:46.952860 25090 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20251212 21:10:46.954578 25090 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:39679
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/master-0/wal
--location_mapping_cmd=/tmp/dist-test-taskAaNqbA/build/release/bin/testdata/assign-location.py --state_store=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/location-assignment.state --map /L0:4
--ipki_ca_key_size=768
--master_addresses=127.23.110.190:45865
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.23.110.190:45865
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.23.110.190
--webserver_port=0
--never_fsync=true
--heap_profile_path=/tmp/kudu.25090
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:10:46.954855 25090 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:10:46.955087 25090 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:10:46.958225 25096 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:46.958214 25095 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:10:46.958251 25090 server_base.cc:1047] running on GCE node
W20251212 21:10:46.958220 25098 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:10:46.958673 25090 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:10:46.958914 25090 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:10:46.960060 25090 hybrid_clock.cc:648] HybridClock initialized: now 1765573846960039 us; error 36 us; skew 500 ppm
I20251212 21:10:46.961295 25090 webserver.cc:492] Webserver started at http://127.23.110.190:37967/ using document root <none> and password file <none>
I20251212 21:10:46.961504 25090 fs_manager.cc:362] Metadata directory not provided
I20251212 21:10:46.961577 25090 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:10:46.961691 25090 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20251212 21:10:46.962587 25090 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/master-0/data/instance:
uuid: "538a5705efe74a76a6b00ecd6f0b3b73"
format_stamp: "Formatted at 2025-12-12 21:10:46 on dist-test-slave-rz82"
I20251212 21:10:46.962862 25090 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/master-0/wal/instance:
uuid: "538a5705efe74a76a6b00ecd6f0b3b73"
format_stamp: "Formatted at 2025-12-12 21:10:46 on dist-test-slave-rz82"
I20251212 21:10:46.964073 25090 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.000s	sys 0.003s
I20251212 21:10:46.965150 25104 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:46.965346 25090 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.000s	sys 0.001s
I20251212 21:10:46.965404 25090 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/master-0/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/master-0/wal
uuid: "538a5705efe74a76a6b00ecd6f0b3b73"
format_stamp: "Formatted at 2025-12-12 21:10:46 on dist-test-slave-rz82"
I20251212 21:10:46.965538 25090 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251212 21:10:46.977506 25090 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:10:46.977810 25090 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:10:46.977921 25090 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:10:46.981701 25090 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.190:45865
I20251212 21:10:46.981778 25156 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.190:45865 every 8 connection(s)
I20251212 21:10:46.982082 25090 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/master-0/data/info.pb
I20251212 21:10:46.982296 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 25090
I20251212 21:10:46.982401 23994 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/master-0/wal/instance
I20251212 21:10:46.982731 25157 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20251212 21:10:46.986230 25157 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 538a5705efe74a76a6b00ecd6f0b3b73: Bootstrap starting.
I20251212 21:10:46.986932 25157 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 538a5705efe74a76a6b00ecd6f0b3b73: Neither blocks nor log segments found. Creating new log.
I20251212 21:10:46.987229 25157 log.cc:826] T 00000000000000000000000000000000 P 538a5705efe74a76a6b00ecd6f0b3b73: Log is configured to *not* fsync() on all Append() calls
I20251212 21:10:46.987987 25157 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 538a5705efe74a76a6b00ecd6f0b3b73: No bootstrap required, opened a new log
I20251212 21:10:46.989070 25157 raft_consensus.cc:359] T 00000000000000000000000000000000 P 538a5705efe74a76a6b00ecd6f0b3b73 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "538a5705efe74a76a6b00ecd6f0b3b73" member_type: VOTER last_known_addr { host: "127.23.110.190" port: 45865 } }
I20251212 21:10:46.989199 25157 raft_consensus.cc:385] T 00000000000000000000000000000000 P 538a5705efe74a76a6b00ecd6f0b3b73 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251212 21:10:46.989226 25157 raft_consensus.cc:740] T 00000000000000000000000000000000 P 538a5705efe74a76a6b00ecd6f0b3b73 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 538a5705efe74a76a6b00ecd6f0b3b73, State: Initialized, Role: FOLLOWER
I20251212 21:10:46.989355 25157 consensus_queue.cc:260] T 00000000000000000000000000000000 P 538a5705efe74a76a6b00ecd6f0b3b73 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "538a5705efe74a76a6b00ecd6f0b3b73" member_type: VOTER last_known_addr { host: "127.23.110.190" port: 45865 } }
I20251212 21:10:46.989411 25157 raft_consensus.cc:399] T 00000000000000000000000000000000 P 538a5705efe74a76a6b00ecd6f0b3b73 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20251212 21:10:46.989441 25157 raft_consensus.cc:493] T 00000000000000000000000000000000 P 538a5705efe74a76a6b00ecd6f0b3b73 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20251212 21:10:46.989493 25157 raft_consensus.cc:3060] T 00000000000000000000000000000000 P 538a5705efe74a76a6b00ecd6f0b3b73 [term 0 FOLLOWER]: Advancing to term 1
I20251212 21:10:46.990031 25157 raft_consensus.cc:515] T 00000000000000000000000000000000 P 538a5705efe74a76a6b00ecd6f0b3b73 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "538a5705efe74a76a6b00ecd6f0b3b73" member_type: VOTER last_known_addr { host: "127.23.110.190" port: 45865 } }
I20251212 21:10:46.990147 25157 leader_election.cc:304] T 00000000000000000000000000000000 P 538a5705efe74a76a6b00ecd6f0b3b73 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 538a5705efe74a76a6b00ecd6f0b3b73; no voters: 
I20251212 21:10:46.990296 25157 leader_election.cc:290] T 00000000000000000000000000000000 P 538a5705efe74a76a6b00ecd6f0b3b73 [CANDIDATE]: Term 1 election: Requested vote from peers 
I20251212 21:10:46.990379 25162 raft_consensus.cc:2804] T 00000000000000000000000000000000 P 538a5705efe74a76a6b00ecd6f0b3b73 [term 1 FOLLOWER]: Leader election won for term 1
I20251212 21:10:46.990543 25157 sys_catalog.cc:565] T 00000000000000000000000000000000 P 538a5705efe74a76a6b00ecd6f0b3b73 [sys.catalog]: configured and running, proceeding with master startup.
I20251212 21:10:46.990574 25162 raft_consensus.cc:697] T 00000000000000000000000000000000 P 538a5705efe74a76a6b00ecd6f0b3b73 [term 1 LEADER]: Becoming Leader. State: Replica: 538a5705efe74a76a6b00ecd6f0b3b73, State: Running, Role: LEADER
I20251212 21:10:46.990696 25162 consensus_queue.cc:237] T 00000000000000000000000000000000 P 538a5705efe74a76a6b00ecd6f0b3b73 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "538a5705efe74a76a6b00ecd6f0b3b73" member_type: VOTER last_known_addr { host: "127.23.110.190" port: 45865 } }
I20251212 21:10:46.991039 25164 sys_catalog.cc:455] T 00000000000000000000000000000000 P 538a5705efe74a76a6b00ecd6f0b3b73 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 538a5705efe74a76a6b00ecd6f0b3b73. Latest consensus state: current_term: 1 leader_uuid: "538a5705efe74a76a6b00ecd6f0b3b73" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "538a5705efe74a76a6b00ecd6f0b3b73" member_type: VOTER last_known_addr { host: "127.23.110.190" port: 45865 } } }
I20251212 21:10:46.991062 25163 sys_catalog.cc:455] T 00000000000000000000000000000000 P 538a5705efe74a76a6b00ecd6f0b3b73 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "538a5705efe74a76a6b00ecd6f0b3b73" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "538a5705efe74a76a6b00ecd6f0b3b73" member_type: VOTER last_known_addr { host: "127.23.110.190" port: 45865 } } }
I20251212 21:10:46.991237 25163 sys_catalog.cc:458] T 00000000000000000000000000000000 P 538a5705efe74a76a6b00ecd6f0b3b73 [sys.catalog]: This master's current role is: LEADER
I20251212 21:10:46.991219 25164 sys_catalog.cc:458] T 00000000000000000000000000000000 P 538a5705efe74a76a6b00ecd6f0b3b73 [sys.catalog]: This master's current role is: LEADER
I20251212 21:10:46.991521 25171 catalog_manager.cc:1485] Loading table and tablet metadata into memory...
I20251212 21:10:46.992061 25171 catalog_manager.cc:1494] Initializing Kudu cluster ID...
I20251212 21:10:46.993655 25171 catalog_manager.cc:1357] Generated new cluster ID: 90c720c07cc64b158f18a33652ce1aee
I20251212 21:10:46.993702 25171 catalog_manager.cc:1505] Initializing Kudu internal certificate authority...
I20251212 21:10:47.005039 25171 catalog_manager.cc:1380] Generated new certificate authority record
I20251212 21:10:47.005663 25171 catalog_manager.cc:1514] Loading token signing keys...
I20251212 21:10:47.009853 25171 catalog_manager.cc:6027] T 00000000000000000000000000000000 P 538a5705efe74a76a6b00ecd6f0b3b73: Generated new TSK 0
I20251212 21:10:47.010056 25171 catalog_manager.cc:1524] Initializing in-progress tserver states...
I20251212 21:10:47.012643 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.129:0
--local_ip_for_outbound_sockets=127.23.110.129
--webserver_interface=127.23.110.129
--webserver_port=0
--tserver_master_addrs=127.23.110.190:45865
--builtin_ntp_servers=127.23.110.148:39679
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251212 21:10:47.094547 25181 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:10:47.094755 25181 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:10:47.094785 25181 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:10:47.096378 25181 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:10:47.096462 25181 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.129
I20251212 21:10:47.098313 25181 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:39679
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.129:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.23.110.129
--webserver_port=0
--tserver_master_addrs=127.23.110.190:45865
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.25181
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.129
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:10:47.098618 25181 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:10:47.098848 25181 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:10:47.101666 25187 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:47.101647 25186 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:10:47.101769 25181 server_base.cc:1047] running on GCE node
W20251212 21:10:47.101647 25189 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:10:47.102108 25181 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:10:47.102327 25181 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:10:47.103507 25181 hybrid_clock.cc:648] HybridClock initialized: now 1765573847103483 us; error 38 us; skew 500 ppm
I20251212 21:10:47.104862 25181 webserver.cc:492] Webserver started at http://127.23.110.129:34247/ using document root <none> and password file <none>
I20251212 21:10:47.105093 25181 fs_manager.cc:362] Metadata directory not provided
I20251212 21:10:47.105144 25181 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:10:47.105296 25181 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20251212 21:10:47.106247 25181 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data/instance:
uuid: "35867ec45b8041d48fc8c7bb132375c5"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:10:47.106696 25181 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal/instance:
uuid: "35867ec45b8041d48fc8c7bb132375c5"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:10:47.108064 25181 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.000s	sys 0.003s
I20251212 21:10:47.108901 25195 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:47.109095 25181 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.000s	sys 0.001s
I20251212 21:10:47.109179 25181 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
uuid: "35867ec45b8041d48fc8c7bb132375c5"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:10:47.109277 25181 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251212 21:10:47.122061 25181 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:10:47.122326 25181 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:10:47.122428 25181 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:10:47.122638 25181 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:10:47.122965 25181 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251212 21:10:47.123001 25181 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:47.123030 25181 ts_tablet_manager.cc:616] Registered 0 tablets
I20251212 21:10:47.123061 25181 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:47.129094 25181 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.129:42339
I20251212 21:10:47.129151 25308 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.129:42339 every 8 connection(s)
I20251212 21:10:47.129572 25181 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data/info.pb
I20251212 21:10:47.134313 25309 heartbeater.cc:344] Connected to a master server at 127.23.110.190:45865
I20251212 21:10:47.134426 25309 heartbeater.cc:461] Registering TS with master...
I20251212 21:10:47.134644 25309 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:10:47.137586 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 25181
I20251212 21:10:47.137691 23994 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal/instance
I20251212 21:10:47.139153 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.130:0
--local_ip_for_outbound_sockets=127.23.110.130
--webserver_interface=127.23.110.130
--webserver_port=0
--tserver_master_addrs=127.23.110.190:45865
--builtin_ntp_servers=127.23.110.148:39679
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251212 21:10:47.176293 25121 ts_manager.cc:194] Registered new tserver with Master: 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339)
I20251212 21:10:47.177049 25121 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.129:54963
W20251212 21:10:47.221907 25313 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:10:47.222101 25313 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:10:47.222131 25313 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:10:47.223726 25313 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:10:47.223799 25313 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.130
I20251212 21:10:47.225536 25313 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:39679
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.130:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.23.110.130
--webserver_port=0
--tserver_master_addrs=127.23.110.190:45865
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.25313
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.130
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:10:47.225804 25313 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:10:47.226032 25313 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:10:47.228941 25318 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:47.228941 25321 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:47.229141 25319 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:10:47.229157 25313 server_base.cc:1047] running on GCE node
I20251212 21:10:47.229357 25313 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:10:47.229635 25313 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:10:47.230803 25313 hybrid_clock.cc:648] HybridClock initialized: now 1765573847230767 us; error 58 us; skew 500 ppm
I20251212 21:10:47.232087 25313 webserver.cc:492] Webserver started at http://127.23.110.130:40621/ using document root <none> and password file <none>
I20251212 21:10:47.232291 25313 fs_manager.cc:362] Metadata directory not provided
I20251212 21:10:47.232336 25313 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:10:47.232437 25313 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20251212 21:10:47.233379 25313 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data/instance:
uuid: "dd9e48fb810447718c09aca5a01b0fe3"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:10:47.233698 25313 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal/instance:
uuid: "dd9e48fb810447718c09aca5a01b0fe3"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:10:47.234965 25313 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.000s	sys 0.001s
I20251212 21:10:47.235824 25327 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:47.236042 25313 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.000s	sys 0.001s
I20251212 21:10:47.236135 25313 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
uuid: "dd9e48fb810447718c09aca5a01b0fe3"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:10:47.236199 25313 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251212 21:10:47.293584 25313 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:10:47.293864 25313 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:10:47.293975 25313 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:10:47.294186 25313 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:10:47.294521 25313 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251212 21:10:47.294554 25313 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:47.294584 25313 ts_tablet_manager.cc:616] Registered 0 tablets
I20251212 21:10:47.294602 25313 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:47.299976 25313 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.130:36255
I20251212 21:10:47.300050 25440 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.130:36255 every 8 connection(s)
I20251212 21:10:47.300371 25313 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data/info.pb
I20251212 21:10:47.304790 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 25313
I20251212 21:10:47.304912 23994 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal/instance
I20251212 21:10:47.305511 25441 heartbeater.cc:344] Connected to a master server at 127.23.110.190:45865
I20251212 21:10:47.305634 25441 heartbeater.cc:461] Registering TS with master...
I20251212 21:10:47.305878 25441 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:10:47.306030 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.131:0
--local_ip_for_outbound_sockets=127.23.110.131
--webserver_interface=127.23.110.131
--webserver_port=0
--tserver_master_addrs=127.23.110.190:45865
--builtin_ntp_servers=127.23.110.148:39679
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251212 21:10:47.341414 25121 ts_manager.cc:194] Registered new tserver with Master: dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130:36255)
I20251212 21:10:47.342159 25121 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.130:53189
W20251212 21:10:47.387861 25444 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:10:47.388227 25444 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:10:47.388279 25444 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:10:47.390513 25444 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:10:47.390604 25444 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.131
I20251212 21:10:47.392891 25444 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:39679
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.131:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.23.110.131
--webserver_port=0
--tserver_master_addrs=127.23.110.190:45865
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.25444
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.131
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:10:47.393168 25444 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:10:47.393421 25444 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:10:47.396492 25451 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:47.396579 25453 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:47.396518 25450 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:10:47.396701 25444 server_base.cc:1047] running on GCE node
I20251212 21:10:47.397197 25444 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:10:47.397486 25444 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:10:47.398705 25444 hybrid_clock.cc:648] HybridClock initialized: now 1765573847398674 us; error 46 us; skew 500 ppm
I20251212 21:10:47.400208 25444 webserver.cc:492] Webserver started at http://127.23.110.131:43311/ using document root <none> and password file <none>
I20251212 21:10:47.400427 25444 fs_manager.cc:362] Metadata directory not provided
I20251212 21:10:47.400478 25444 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:10:47.400583 25444 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20251212 21:10:47.401526 25444 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data/instance:
uuid: "c941504e89314a6a868d59585d254b81"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:10:47.401826 25444 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal/instance:
uuid: "c941504e89314a6a868d59585d254b81"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:10:47.403069 25444 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.002s	sys 0.000s
I20251212 21:10:47.404060 25459 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:47.404318 25444 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:47.404407 25444 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
uuid: "c941504e89314a6a868d59585d254b81"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:10:47.404556 25444 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251212 21:10:47.412235 25444 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:10:47.412575 25444 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:10:47.412688 25444 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:10:47.412897 25444 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:10:47.413281 25444 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251212 21:10:47.413321 25444 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:47.413350 25444 ts_tablet_manager.cc:616] Registered 0 tablets
I20251212 21:10:47.413373 25444 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:47.418991 25444 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.131:33221
I20251212 21:10:47.419054 25572 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.131:33221 every 8 connection(s)
I20251212 21:10:47.419345 25444 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data/info.pb
I20251212 21:10:47.421135 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 25444
I20251212 21:10:47.421268 23994 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal/instance
I20251212 21:10:47.422575 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.132:0
--local_ip_for_outbound_sockets=127.23.110.132
--webserver_interface=127.23.110.132
--webserver_port=0
--tserver_master_addrs=127.23.110.190:45865
--builtin_ntp_servers=127.23.110.148:39679
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251212 21:10:47.424628 25573 heartbeater.cc:344] Connected to a master server at 127.23.110.190:45865
I20251212 21:10:47.424731 25573 heartbeater.cc:461] Registering TS with master...
I20251212 21:10:47.424947 25573 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:10:47.463501 25121 ts_manager.cc:194] Registered new tserver with Master: c941504e89314a6a868d59585d254b81 (127.23.110.131:33221)
I20251212 21:10:47.464183 25121 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.131:45831
W20251212 21:10:47.522122 25576 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:10:47.522325 25576 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:10:47.522354 25576 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:10:47.524248 25576 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:10:47.524362 25576 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.132
I20251212 21:10:47.526338 25576 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:39679
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.132:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.23.110.132
--webserver_port=0
--tserver_master_addrs=127.23.110.190:45865
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.25576
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.132
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:10:47.526675 25576 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:10:47.526927 25576 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:10:47.530181 25582 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:47.530184 25585 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:47.530184 25583 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:10:47.530576 25576 server_base.cc:1047] running on GCE node
I20251212 21:10:47.530761 25576 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:10:47.530985 25576 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:10:47.532150 25576 hybrid_clock.cc:648] HybridClock initialized: now 1765573847532114 us; error 68 us; skew 500 ppm
I20251212 21:10:47.533833 25576 webserver.cc:492] Webserver started at http://127.23.110.132:34523/ using document root <none> and password file <none>
I20251212 21:10:47.534057 25576 fs_manager.cc:362] Metadata directory not provided
I20251212 21:10:47.534111 25576 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:10:47.534219 25576 server_base.cc:895] This appears to be a new deployment of Kudu; creating new FS layout
I20251212 21:10:47.535141 25576 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data/instance:
uuid: "3d78cc34680848ddacf9620033efe712"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:10:47.535454 25576 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal/instance:
uuid: "3d78cc34680848ddacf9620033efe712"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:10:47.536881 25576 fs_manager.cc:696] Time spent creating directory manager: real 0.001s	user 0.002s	sys 0.001s
I20251212 21:10:47.537861 25591 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:47.538077 25576 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:10:47.538156 25576 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
uuid: "3d78cc34680848ddacf9620033efe712"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:10:47.538226 25576 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251212 21:10:47.576915 25576 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:10:47.577216 25576 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:10:47.577378 25576 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:10:47.577674 25576 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:10:47.578028 25576 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251212 21:10:47.578063 25576 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:47.578094 25576 ts_tablet_manager.cc:616] Registered 0 tablets
I20251212 21:10:47.578117 25576 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:47.584254 25576 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.132:41841
I20251212 21:10:47.584300 25704 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.132:41841 every 8 connection(s)
I20251212 21:10:47.584616 25576 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data/info.pb
I20251212 21:10:47.589280 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 25576
I20251212 21:10:47.589385 23994 external_mini_cluster.cc:1442] Reading /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal/instance
I20251212 21:10:47.589895 25705 heartbeater.cc:344] Connected to a master server at 127.23.110.190:45865
I20251212 21:10:47.590029 25705 heartbeater.cc:461] Registering TS with master...
I20251212 21:10:47.590341 25705 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:10:47.636198 25121 ts_manager.cc:194] Registered new tserver with Master: 3d78cc34680848ddacf9620033efe712 (127.23.110.132:41841)
I20251212 21:10:47.636945 25121 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.132:33747
I20251212 21:10:47.637346 23994 external_mini_cluster.cc:949] 4 TS(s) registered with all masters
I20251212 21:10:47.645221 23994 test_util.cc:276] Using random seed: -1308394462
I20251212 21:10:47.653353 25120 catalog_manager.cc:2257] Servicing CreateTable request from {username='slave'} at 127.0.0.1:47316:
name: "test-workload"
schema {
  columns {
    name: "key"
    type: INT32
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "int_val"
    type: INT32
    is_key: false
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "string_val"
    type: STRING
    is_key: false
    is_nullable: true
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
  range_schema {
    columns {
      name: "key"
    }
  }
}
I20251212 21:10:47.661430 25375 tablet_service.cc:1505] Processing CreateTablet for tablet edaf6af028f1466d9dccb7d78cf88122 (DEFAULT_TABLE table=test-workload [id=4c69f336a6e14576945328a36d809d4b]), partition=RANGE (key) PARTITION UNBOUNDED
I20251212 21:10:47.661877 25375 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet edaf6af028f1466d9dccb7d78cf88122. 1 dirs total, 0 dirs full, 0 dirs failed
I20251212 21:10:47.661867 25243 tablet_service.cc:1505] Processing CreateTablet for tablet edaf6af028f1466d9dccb7d78cf88122 (DEFAULT_TABLE table=test-workload [id=4c69f336a6e14576945328a36d809d4b]), partition=RANGE (key) PARTITION UNBOUNDED
I20251212 21:10:47.662163 25243 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet edaf6af028f1466d9dccb7d78cf88122. 1 dirs total, 0 dirs full, 0 dirs failed
I20251212 21:10:47.664072 25728 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Bootstrap starting.
I20251212 21:10:47.664139 25729 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Bootstrap starting.
I20251212 21:10:47.664772 25728 tablet_bootstrap.cc:654] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Neither blocks nor log segments found. Creating new log.
I20251212 21:10:47.665089 25728 log.cc:826] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Log is configured to *not* fsync() on all Append() calls
I20251212 21:10:47.665138 25507 tablet_service.cc:1505] Processing CreateTablet for tablet edaf6af028f1466d9dccb7d78cf88122 (DEFAULT_TABLE table=test-workload [id=4c69f336a6e14576945328a36d809d4b]), partition=RANGE (key) PARTITION UNBOUNDED
I20251212 21:10:47.665472 25507 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet edaf6af028f1466d9dccb7d78cf88122. 1 dirs total, 0 dirs full, 0 dirs failed
I20251212 21:10:47.666033 25728 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: No bootstrap required, opened a new log
I20251212 21:10:47.666110 25728 ts_tablet_manager.cc:1403] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Time spent bootstrapping tablet: real 0.002s	user 0.001s	sys 0.001s
I20251212 21:10:47.667734 25728 raft_consensus.cc:359] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:47.668146 25731 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Bootstrap starting.
I20251212 21:10:47.668352 25728 raft_consensus.cc:385] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251212 21:10:47.668524 25728 raft_consensus.cc:740] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 35867ec45b8041d48fc8c7bb132375c5, State: Initialized, Role: FOLLOWER
I20251212 21:10:47.668696 25728 consensus_queue.cc:260] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:47.668889 25731 tablet_bootstrap.cc:654] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Neither blocks nor log segments found. Creating new log.
I20251212 21:10:47.669080 25728 ts_tablet_manager.cc:1434] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Time spent starting tablet: real 0.003s	user 0.000s	sys 0.003s
I20251212 21:10:47.669308 25309 heartbeater.cc:499] Master 127.23.110.190:45865 was elected leader, sending a full tablet report...
I20251212 21:10:47.669641 25729 tablet_bootstrap.cc:654] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Neither blocks nor log segments found. Creating new log.
I20251212 21:10:47.669700 25731 log.cc:826] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Log is configured to *not* fsync() on all Append() calls
I20251212 21:10:47.670166 25729 log.cc:826] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Log is configured to *not* fsync() on all Append() calls
I20251212 21:10:47.670462 25731 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: No bootstrap required, opened a new log
I20251212 21:10:47.670534 25731 ts_tablet_manager.cc:1403] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Time spent bootstrapping tablet: real 0.002s	user 0.002s	sys 0.000s
I20251212 21:10:47.671063 25729 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: No bootstrap required, opened a new log
I20251212 21:10:47.671139 25729 ts_tablet_manager.cc:1403] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Time spent bootstrapping tablet: real 0.007s	user 0.001s	sys 0.002s
I20251212 21:10:47.672097 25731 raft_consensus.cc:359] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:47.672235 25731 raft_consensus.cc:385] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251212 21:10:47.672253 25731 raft_consensus.cc:740] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: c941504e89314a6a868d59585d254b81, State: Initialized, Role: FOLLOWER
I20251212 21:10:47.672336 25731 consensus_queue.cc:260] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:47.672591 25731 ts_tablet_manager.cc:1434] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
I20251212 21:10:47.672650 25573 heartbeater.cc:499] Master 127.23.110.190:45865 was elected leader, sending a full tablet report...
I20251212 21:10:47.672705 25729 raft_consensus.cc:359] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:47.672849 25729 raft_consensus.cc:385] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20251212 21:10:47.672884 25729 raft_consensus.cc:740] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: dd9e48fb810447718c09aca5a01b0fe3, State: Initialized, Role: FOLLOWER
I20251212 21:10:47.672978 25729 consensus_queue.cc:260] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:47.673210 25729 ts_tablet_manager.cc:1434] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Time spent starting tablet: real 0.002s	user 0.000s	sys 0.002s
I20251212 21:10:47.673288 25441 heartbeater.cc:499] Master 127.23.110.190:45865 was elected leader, sending a full tablet report...
I20251212 21:10:47.697625 25735 raft_consensus.cc:493] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251212 21:10:47.697800 25735 raft_consensus.cc:515] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:47.698222 25735 leader_election.cc:290] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339), dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130:36255)
I20251212 21:10:47.701383 25263 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "c941504e89314a6a868d59585d254b81" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "35867ec45b8041d48fc8c7bb132375c5" is_pre_election: true
I20251212 21:10:47.701567 25263 raft_consensus.cc:2468] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate c941504e89314a6a868d59585d254b81 in term 0.
I20251212 21:10:47.701997 25395 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "c941504e89314a6a868d59585d254b81" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "dd9e48fb810447718c09aca5a01b0fe3" is_pre_election: true
I20251212 21:10:47.702132 25395 raft_consensus.cc:2468] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate c941504e89314a6a868d59585d254b81 in term 0.
I20251212 21:10:47.702046 25461 leader_election.cc:304] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 35867ec45b8041d48fc8c7bb132375c5, c941504e89314a6a868d59585d254b81; no voters: 
I20251212 21:10:47.702312 25735 raft_consensus.cc:2804] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20251212 21:10:47.702382 25735 raft_consensus.cc:493] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251212 21:10:47.702399 25735 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 0 FOLLOWER]: Advancing to term 1
I20251212 21:10:47.703115 25735 raft_consensus.cc:515] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:47.703270 25735 leader_election.cc:290] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [CANDIDATE]: Term 1 election: Requested vote from peers 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339), dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130:36255)
I20251212 21:10:47.703465 25263 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "c941504e89314a6a868d59585d254b81" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "35867ec45b8041d48fc8c7bb132375c5"
I20251212 21:10:47.703505 25395 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "c941504e89314a6a868d59585d254b81" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "dd9e48fb810447718c09aca5a01b0fe3"
I20251212 21:10:47.703558 25263 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 0 FOLLOWER]: Advancing to term 1
I20251212 21:10:47.703570 25395 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 0 FOLLOWER]: Advancing to term 1
I20251212 21:10:47.704346 25395 raft_consensus.cc:2468] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate c941504e89314a6a868d59585d254b81 in term 1.
I20251212 21:10:47.704345 25263 raft_consensus.cc:2468] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate c941504e89314a6a868d59585d254b81 in term 1.
I20251212 21:10:47.704758 25462 leader_election.cc:304] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: c941504e89314a6a868d59585d254b81, dd9e48fb810447718c09aca5a01b0fe3; no voters: 
I20251212 21:10:47.704918 25735 raft_consensus.cc:2804] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 1 FOLLOWER]: Leader election won for term 1
I20251212 21:10:47.705204 25735 raft_consensus.cc:697] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 1 LEADER]: Becoming Leader. State: Replica: c941504e89314a6a868d59585d254b81, State: Running, Role: LEADER
I20251212 21:10:47.705380 25735 consensus_queue.cc:237] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:47.706655 25120 catalog_manager.cc:5654] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 reported cstate change: term changed from 0 to 1, leader changed from <none> to c941504e89314a6a868d59585d254b81 (127.23.110.131). New cstate: current_term: 1 leader_uuid: "c941504e89314a6a868d59585d254b81" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } health_report { overall_health: UNKNOWN } } }
I20251212 21:10:47.720279 23994 maintenance_mode-itest.cc:745] Restarting batch of 4 tservers: dd9e48fb810447718c09aca5a01b0fe3,35867ec45b8041d48fc8c7bb132375c5,3d78cc34680848ddacf9620033efe712,c941504e89314a6a868d59585d254b81
I20251212 21:10:47.756752 25395 raft_consensus.cc:1275] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 1 FOLLOWER]: Refusing update from remote peer c941504e89314a6a868d59585d254b81: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251212 21:10:47.757105 25263 raft_consensus.cc:1275] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 1 FOLLOWER]: Refusing update from remote peer c941504e89314a6a868d59585d254b81: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20251212 21:10:47.757639 25735 consensus_queue.cc:1048] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [LEADER]: Connected to new peer: Peer: permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251212 21:10:47.758138 25735 consensus_queue.cc:1048] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [LEADER]: Connected to new peer: Peer: permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20251212 21:10:47.772616 25758 mvcc.cc:204] Tried to move back new op lower bound from 7231790480411475968 to 7231790480201957376. Current Snapshot: MvccSnapshot[applied={T|T < 7231790480356954112}]
I20251212 21:10:47.773926 25756 mvcc.cc:204] Tried to move back new op lower bound from 7231790480411475968 to 7231790480201957376. Current Snapshot: MvccSnapshot[applied={T|T < 7231790480356954112}]
I20251212 21:10:47.873752 25112 ts_manager.cc:295] Set tserver state for c941504e89314a6a868d59585d254b81 to MAINTENANCE_MODE
I20251212 21:10:47.895473 25112 ts_manager.cc:295] Set tserver state for 3d78cc34680848ddacf9620033efe712 to MAINTENANCE_MODE
I20251212 21:10:47.976713 25112 ts_manager.cc:295] Set tserver state for 35867ec45b8041d48fc8c7bb132375c5 to MAINTENANCE_MODE
I20251212 21:10:48.115518 25639 tablet_service.cc:1460] Tablet server 3d78cc34680848ddacf9620033efe712 set to quiescing
I20251212 21:10:48.115603 25639 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:10:48.135808 25507 tablet_service.cc:1460] Tablet server c941504e89314a6a868d59585d254b81 set to quiescing
I20251212 21:10:48.135893 25507 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20251212 21:10:48.136094 25771 raft_consensus.cc:993] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: : Instructing follower 35867ec45b8041d48fc8c7bb132375c5 to start an election
I20251212 21:10:48.136256 25771 raft_consensus.cc:1081] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 1 LEADER]: Signalling peer 35867ec45b8041d48fc8c7bb132375c5 to start an election
I20251212 21:10:48.136575 25262 tablet_service.cc:2038] Received Run Leader Election RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122"
dest_uuid: "35867ec45b8041d48fc8c7bb132375c5"
 from {username='slave'} at 127.23.110.131:51099
I20251212 21:10:48.136675 25262 raft_consensus.cc:493] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 1 FOLLOWER]: Starting forced leader election (received explicit request)
I20251212 21:10:48.136704 25262 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 1 FOLLOWER]: Advancing to term 2
I20251212 21:10:48.137535 25262 raft_consensus.cc:515] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 2 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:48.138243 25262 leader_election.cc:290] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [CANDIDATE]: Term 2 election: Requested vote from peers c941504e89314a6a868d59585d254b81 (127.23.110.131:33221), dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130:36255)
I20251212 21:10:48.138794 25262 raft_consensus.cc:1240] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 2 FOLLOWER]: Rejecting Update request from peer c941504e89314a6a868d59585d254b81 for earlier term 1. Current term is 2. Ops: []
I20251212 21:10:48.139048 25770 consensus_queue.cc:1059] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 }, Status: INVALID_TERM, Last received: 1.219, Next index: 220, Last known committed idx: 218, Time since last communication: 0.000s
I20251212 21:10:48.139150 25770 raft_consensus.cc:3055] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 1 LEADER]: Stepping down as leader of term 1
I20251212 21:10:48.139176 25770 raft_consensus.cc:740] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 1 LEADER]: Becoming Follower/Learner. State: Replica: c941504e89314a6a868d59585d254b81, State: Running, Role: LEADER
I20251212 21:10:48.139225 25770 consensus_queue.cc:260] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 219, Committed index: 219, Last appended: 1.219, Last appended by leader: 219, Current term: 1, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:48.139338 25770 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 1 FOLLOWER]: Advancing to term 2
I20251212 21:10:48.141820 25527 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "35867ec45b8041d48fc8c7bb132375c5" candidate_term: 2 candidate_status { last_received { term: 1 index: 219 } } ignore_live_leader: true dest_uuid: "c941504e89314a6a868d59585d254b81"
I20251212 21:10:48.142743 25527 raft_consensus.cc:2468] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 2 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 35867ec45b8041d48fc8c7bb132375c5 in term 2.
W20251212 21:10:48.142971 25487 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.143217 25487 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
I20251212 21:10:48.143198 25199 leader_election.cc:304] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 35867ec45b8041d48fc8c7bb132375c5, c941504e89314a6a868d59585d254b81; no voters: 
I20251212 21:10:48.143368 25732 raft_consensus.cc:2804] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 2 FOLLOWER]: Leader election won for term 2
W20251212 21:10:48.143397 25487 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
I20251212 21:10:48.143496 25732 raft_consensus.cc:697] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 2 LEADER]: Becoming Leader. State: Replica: 35867ec45b8041d48fc8c7bb132375c5, State: Running, Role: LEADER
I20251212 21:10:48.143577 25732 consensus_queue.cc:237] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 218, Committed index: 218, Last appended: 1.219, Last appended by leader: 219, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:48.144196 25112 catalog_manager.cc:5654] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 reported cstate change: term changed from 1 to 2, leader changed from c941504e89314a6a868d59585d254b81 (127.23.110.131) to 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129). New cstate: current_term: 2 leader_uuid: "35867ec45b8041d48fc8c7bb132375c5" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } health_report { overall_health: UNKNOWN } } }
I20251212 21:10:48.148785 25394 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "35867ec45b8041d48fc8c7bb132375c5" candidate_term: 2 candidate_status { last_received { term: 1 index: 219 } } ignore_live_leader: true dest_uuid: "dd9e48fb810447718c09aca5a01b0fe3"
I20251212 21:10:48.148979 25394 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 1 FOLLOWER]: Advancing to term 2
I20251212 21:10:48.149626 25527 raft_consensus.cc:1275] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 2 FOLLOWER]: Refusing update from remote peer 35867ec45b8041d48fc8c7bb132375c5: Log matching property violated. Preceding OpId in replica: term: 1 index: 219. Preceding OpId from leader: term: 2 index: 222. (index mismatch)
I20251212 21:10:48.150103 25732 consensus_queue.cc:1048] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [LEADER]: Connected to new peer: Peer: permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 220, Last known committed idx: 219, Time since last communication: 0.000s
I20251212 21:10:48.150148 25394 raft_consensus.cc:2468] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 2 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 35867ec45b8041d48fc8c7bb132375c5 in term 2.
I20251212 21:10:48.150244 25395 raft_consensus.cc:1275] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 2 FOLLOWER]: Refusing update from remote peer 35867ec45b8041d48fc8c7bb132375c5: Log matching property violated. Preceding OpId in replica: term: 1 index: 216. Preceding OpId from leader: term: 2 index: 222. (index mismatch)
I20251212 21:10:48.150560 25732 consensus_queue.cc:1048] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [LEADER]: Connected to new peer: Peer: permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 220, Last known committed idx: 216, Time since last communication: 0.000s
I20251212 21:10:48.213289 25755 mvcc.cc:204] Tried to move back new op lower bound from 7231790482022002688 to 7231790481996836864. Current Snapshot: MvccSnapshot[applied={T|T < 7231790482017640448}]
I20251212 21:10:48.217334 25243 tablet_service.cc:1460] Tablet server 35867ec45b8041d48fc8c7bb132375c5 set to quiescing
I20251212 21:10:48.217448 25243 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20251212 21:10:48.221782 25732 raft_consensus.cc:993] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: : Instructing follower dd9e48fb810447718c09aca5a01b0fe3 to start an election
I20251212 21:10:48.221876 25732 raft_consensus.cc:1081] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 2 LEADER]: Signalling peer dd9e48fb810447718c09aca5a01b0fe3 to start an election
I20251212 21:10:48.222255 25394 tablet_service.cc:2038] Received Run Leader Election RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122"
dest_uuid: "dd9e48fb810447718c09aca5a01b0fe3"
 from {username='slave'} at 127.23.110.129:43987
I20251212 21:10:48.222368 25394 raft_consensus.cc:493] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 2 FOLLOWER]: Starting forced leader election (received explicit request)
I20251212 21:10:48.222395 25394 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 2 FOLLOWER]: Advancing to term 3
I20251212 21:10:48.223197 25394 raft_consensus.cc:515] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 3 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:48.223611 25394 leader_election.cc:290] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 3 election: Requested vote from peers 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339), c941504e89314a6a868d59585d254b81 (127.23.110.131:33221)
I20251212 21:10:48.226015 25395 raft_consensus.cc:1240] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 3 FOLLOWER]: Rejecting Update request from peer 35867ec45b8041d48fc8c7bb132375c5 for earlier term 2. Current term is 3. Ops: [2.260-2.260]
I20251212 21:10:48.227917 25828 consensus_queue.cc:1059] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 }, Status: INVALID_TERM, Last received: 2.259, Next index: 260, Last known committed idx: 259, Time since last communication: 0.000s
I20251212 21:10:48.228063 25828 raft_consensus.cc:3055] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 2 LEADER]: Stepping down as leader of term 2
I20251212 21:10:48.228092 25828 raft_consensus.cc:740] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 2 LEADER]: Becoming Follower/Learner. State: Replica: 35867ec45b8041d48fc8c7bb132375c5, State: Running, Role: LEADER
I20251212 21:10:48.228142 25828 consensus_queue.cc:260] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 259, Committed index: 259, Last appended: 2.260, Last appended by leader: 260, Current term: 2, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:48.228241 25828 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 2 FOLLOWER]: Advancing to term 3
I20251212 21:10:48.231025 25262 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "dd9e48fb810447718c09aca5a01b0fe3" candidate_term: 3 candidate_status { last_received { term: 2 index: 259 } } ignore_live_leader: true dest_uuid: "35867ec45b8041d48fc8c7bb132375c5"
I20251212 21:10:48.231163 25262 raft_consensus.cc:2410] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 3 FOLLOWER]: Leader election vote request: Denying vote to candidate dd9e48fb810447718c09aca5a01b0fe3 for term 3 because replica has last-logged OpId of term: 2 index: 260, which is greater than that of the candidate, which has last-logged OpId of term: 2 index: 259.
W20251212 21:10:48.232172 25756 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: Cannot assign timestamp to op. Tablet is not in leader mode. Last heard from a leader: 0.010s ago.
I20251212 21:10:48.233006 25527 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "dd9e48fb810447718c09aca5a01b0fe3" candidate_term: 3 candidate_status { last_received { term: 2 index: 259 } } ignore_live_leader: true dest_uuid: "c941504e89314a6a868d59585d254b81"
I20251212 21:10:48.233158 25527 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 2 FOLLOWER]: Advancing to term 3
W20251212 21:10:48.233219 25221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
I20251212 21:10:48.234061 25527 raft_consensus.cc:2410] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 3 FOLLOWER]: Leader election vote request: Denying vote to candidate dd9e48fb810447718c09aca5a01b0fe3 for term 3 because replica has last-logged OpId of term: 2 index: 260, which is greater than that of the candidate, which has last-logged OpId of term: 2 index: 259.
I20251212 21:10:48.234290 25331 leader_election.cc:304] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 3 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: dd9e48fb810447718c09aca5a01b0fe3; no voters: 35867ec45b8041d48fc8c7bb132375c5, c941504e89314a6a868d59585d254b81
I20251212 21:10:48.234556 25736 raft_consensus.cc:2749] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 3 FOLLOWER]: Leader election lost for term 3. Reason: could not achieve majority
W20251212 21:10:48.235261 25487 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.237428 25487 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.239285 25353 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33342: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.241613 25355 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33342: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.245438 25221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
I20251212 21:10:48.245618 25112 ts_manager.cc:295] Set tserver state for dd9e48fb810447718c09aca5a01b0fe3 to MAINTENANCE_MODE
W20251212 21:10:48.248706 25221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.253384 25487 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.255568 25487 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.262269 25355 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33342: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.263373 25354 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33342: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.272485 25221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.276156 25221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.280794 25487 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.284878 25487 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.289727 25353 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33342: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.293892 25353 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33342: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.300524 25221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.305542 25221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.313427 25487 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.317723 25487 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.327647 25353 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33342: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.330622 25353 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33342: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.344137 25221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.348336 25221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.361462 25487 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.365530 25487 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.380571 25354 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33342: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.380582 25353 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33342: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.397133 25221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.399255 25221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
I20251212 21:10:48.407408 25375 tablet_service.cc:1460] Tablet server dd9e48fb810447718c09aca5a01b0fe3 set to quiescing
I20251212 21:10:48.407488 25375 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251212 21:10:48.416209 25487 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.417124 25487 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.434890 25353 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33342: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.434890 25354 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33342: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.457147 25221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.457957 25221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.477865 25487 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.479962 25487 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.500921 25354 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33342: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.502893 25354 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33342: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.527323 25221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.527329 25223 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.530990 25771 raft_consensus.cc:670] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: failed to trigger leader election: Illegal state: leader elections are disabled
W20251212 21:10:48.551419 25486 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.551419 25487 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.576432 25354 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33342: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.578465 25354 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33342: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.590265 25826 raft_consensus.cc:670] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: failed to trigger leader election: Illegal state: leader elections are disabled
W20251212 21:10:48.605155 25223 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.605967 25223 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.632969 25486 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.635007 25486 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
I20251212 21:10:48.638108 25705 heartbeater.cc:499] Master 127.23.110.190:45865 was elected leader, sending a full tablet report...
W20251212 21:10:48.652186 25736 raft_consensus.cc:670] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: failed to trigger leader election: Illegal state: leader elections are disabled
W20251212 21:10:48.661978 25354 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33342: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.664076 25354 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33342: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.692497 25223 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.694617 25223 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.722448 25486 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.725585 25486 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.755584 25354 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33342: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.757617 25354 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33342: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.792207 25223 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.792240 25221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.826059 25486 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.827050 25486 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.860694 25353 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33342: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.860716 25354 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33342: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.895265 25221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.895265 25223 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.930212 25487 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.931255 25487 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.969394 25353 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33342: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:48.969394 25354 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33342: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:49.010155 25223 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:49.010960 25223 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:49.048915 25487 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:49.053131 25487 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:49.090058 25354 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33342: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:49.092163 25354 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33342: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:49.132872 25223 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:49.133482 25223 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:49.174374 25487 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:49.177587 25487 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:49.220484 25354 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33342: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:49.220492 25353 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33342: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:49.267198 25221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:49.267199 25223 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
I20251212 21:10:49.288847 25507 tablet_service.cc:1460] Tablet server c941504e89314a6a868d59585d254b81 set to quiescing
I20251212 21:10:49.288921 25507 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251212 21:10:49.312263 25487 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:49.314282 25487 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:49.359331 25353 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33342: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:49.362399 25353 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33342: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
I20251212 21:10:49.373553 25243 tablet_service.cc:1460] Tablet server 35867ec45b8041d48fc8c7bb132375c5 set to quiescing
I20251212 21:10:49.373620 25243 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251212 21:10:49.408893 25221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:49.411953 25221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
I20251212 21:10:49.429072 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 25313
I20251212 21:10:49.434705 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.130:36255
--local_ip_for_outbound_sockets=127.23.110.130
--tserver_master_addrs=127.23.110.190:45865
--webserver_port=40621
--webserver_interface=127.23.110.130
--builtin_ntp_servers=127.23.110.148:39679
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251212 21:10:49.457010 25487 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:49.462220 25487 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:49.506269 25716 meta_cache.cc:302] tablet edaf6af028f1466d9dccb7d78cf88122: replica dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130:36255) has failed: Network error: Client connection negotiation failed: client connection to 127.23.110.130:36255: connect: Connection refused (error 111)
W20251212 21:10:49.514977 25221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45678: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:49.517657 25881 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:10:49.517861 25881 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:10:49.517889 25881 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:10:49.519809 25881 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:10:49.519881 25881 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.130
I20251212 21:10:49.521693 25881 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:39679
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.130:36255
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.23.110.130
--webserver_port=40621
--tserver_master_addrs=127.23.110.190:45865
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.25881
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.130
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:10:49.521955 25881 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:10:49.522183 25881 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:10:49.525053 25890 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:49.525106 25887 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:10:49.525192 25881 server_base.cc:1047] running on GCE node
W20251212 21:10:49.525053 25888 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:10:49.525734 25881 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:10:49.525960 25881 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:10:49.527112 25881 hybrid_clock.cc:648] HybridClock initialized: now 1765573849527097 us; error 33 us; skew 500 ppm
I20251212 21:10:49.528303 25881 webserver.cc:492] Webserver started at http://127.23.110.130:40621/ using document root <none> and password file <none>
I20251212 21:10:49.528492 25881 fs_manager.cc:362] Metadata directory not provided
I20251212 21:10:49.528558 25881 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:10:49.529888 25881 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:49.530622 25896 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:49.530783 25881 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20251212 21:10:49.530865 25881 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
uuid: "dd9e48fb810447718c09aca5a01b0fe3"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:10:49.531121 25881 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251212 21:10:49.543537 25881 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:10:49.543797 25881 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:10:49.543900 25881 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:10:49.544099 25881 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:10:49.544538 25903 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251212 21:10:49.545470 25881 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251212 21:10:49.545570 25881 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:10:49.545614 25881 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251212 21:10:49.546188 25881 ts_tablet_manager.cc:616] Registered 1 tablets
I20251212 21:10:49.546224 25881 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:10:49.546299 25903 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Bootstrap starting.
I20251212 21:10:49.553109 25881 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.130:36255
I20251212 21:10:49.553148 26010 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.130:36255 every 8 connection(s)
I20251212 21:10:49.553508 25881 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data/info.pb
I20251212 21:10:49.558362 26011 heartbeater.cc:344] Connected to a master server at 127.23.110.190:45865
I20251212 21:10:49.558475 26011 heartbeater.cc:461] Registering TS with master...
I20251212 21:10:49.558671 26011 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:10:49.559206 25114 ts_manager.cc:194] Re-registered known tserver with Master: dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130:36255)
I20251212 21:10:49.559221 25903 log.cc:826] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Log is configured to *not* fsync() on all Append() calls
I20251212 21:10:49.559619 25114 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.130:47815
I20251212 21:10:49.559830 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 25881
I20251212 21:10:49.559911 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 25181
I20251212 21:10:49.566102 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.129:42339
--local_ip_for_outbound_sockets=127.23.110.129
--tserver_master_addrs=127.23.110.190:45865
--webserver_port=34247
--webserver_interface=127.23.110.129
--builtin_ntp_servers=127.23.110.148:39679
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251212 21:10:49.597182 25903 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Bootstrap replayed 1/1 log segments. Stats: ops{read=259 overwritten=0 applied=259 ignored=0} inserts{seen=12850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251212 21:10:49.597594 25903 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Bootstrap complete.
I20251212 21:10:49.598372 25903 ts_tablet_manager.cc:1403] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Time spent bootstrapping tablet: real 0.052s	user 0.033s	sys 0.015s
I20251212 21:10:49.599162 25903 raft_consensus.cc:359] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 3 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:49.599381 25903 raft_consensus.cc:740] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 3 FOLLOWER]: Becoming Follower/Learner. State: Replica: dd9e48fb810447718c09aca5a01b0fe3, State: Initialized, Role: FOLLOWER
I20251212 21:10:49.599472 25903 consensus_queue.cc:260] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 259, Last appended: 2.259, Last appended by leader: 259, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:49.599725 25903 ts_tablet_manager.cc:1434] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Time spent starting tablet: real 0.001s	user 0.002s	sys 0.001s
I20251212 21:10:49.599807 26011 heartbeater.cc:499] Master 127.23.110.190:45865 was elected leader, sending a full tablet report...
W20251212 21:10:49.618582 25487 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:49.639101 25744 scanner-internal.cc:458] Time spent opening tablet: real 1.204s	user 0.001s	sys 0.000s
W20251212 21:10:49.639261 25743 scanner-internal.cc:458] Time spent opening tablet: real 1.205s	user 0.001s	sys 0.001s
W20251212 21:10:49.640920 25745 scanner-internal.cc:458] Time spent opening tablet: real 1.203s	user 0.001s	sys 0.000s
W20251212 21:10:49.652412 26017 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:10:49.652667 26017 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:10:49.652704 26017 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:10:49.655148 26017 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:10:49.655253 26017 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.129
I20251212 21:10:49.657938 26017 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:39679
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.129:42339
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.23.110.129
--webserver_port=34247
--tserver_master_addrs=127.23.110.190:45865
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.26017
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.129
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:10:49.658252 26017 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:10:49.658572 26017 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:10:49.662096 26027 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:49.662225 26025 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:10:49.662572 26017 server_base.cc:1047] running on GCE node
W20251212 21:10:49.662225 26024 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:10:49.662930 26017 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:10:49.663162 26017 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:10:49.664333 26017 hybrid_clock.cc:648] HybridClock initialized: now 1765573849664294 us; error 49 us; skew 500 ppm
I20251212 21:10:49.665932 26017 webserver.cc:492] Webserver started at http://127.23.110.129:34247/ using document root <none> and password file <none>
I20251212 21:10:49.666163 26017 fs_manager.cc:362] Metadata directory not provided
I20251212 21:10:49.666229 26017 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
W20251212 21:10:49.667678 25486 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
I20251212 21:10:49.667785 26017 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:49.668648 26033 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:49.668880 26017 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:49.668956 26017 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
uuid: "35867ec45b8041d48fc8c7bb132375c5"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:10:49.669317 26017 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251212 21:10:49.697376 26017 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:10:49.697651 26017 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:10:49.697762 26017 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:10:49.697983 26017 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:10:49.698411 26040 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251212 21:10:49.699252 26017 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251212 21:10:49.699296 26017 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:10:49.699321 26017 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251212 21:10:49.699843 26017 ts_tablet_manager.cc:616] Registered 1 tablets
I20251212 21:10:49.699883 26017 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:10:49.699944 26040 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Bootstrap starting.
I20251212 21:10:49.707609 26017 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.129:42339
I20251212 21:10:49.708117 26017 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data/info.pb
I20251212 21:10:49.709465 26147 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.129:42339 every 8 connection(s)
I20251212 21:10:49.712168 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 26017
I20251212 21:10:49.712280 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 25576
I20251212 21:10:49.715512 26040 log.cc:826] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Log is configured to *not* fsync() on all Append() calls
I20251212 21:10:49.718340 26148 heartbeater.cc:344] Connected to a master server at 127.23.110.190:45865
I20251212 21:10:49.718475 26148 heartbeater.cc:461] Registering TS with master...
I20251212 21:10:49.718719 26148 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:10:49.719285 25114 ts_manager.cc:194] Re-registered known tserver with Master: 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339)
I20251212 21:10:49.719480 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.132:41841
--local_ip_for_outbound_sockets=127.23.110.132
--tserver_master_addrs=127.23.110.190:45865
--webserver_port=34523
--webserver_interface=127.23.110.132
--builtin_ntp_servers=127.23.110.148:39679
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251212 21:10:49.719825 25114 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.129:45049
W20251212 21:10:49.722137 25924 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33366: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:49.724098 25924 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33366: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
I20251212 21:10:49.760143 26040 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Bootstrap replayed 1/1 log segments. Stats: ops{read=260 overwritten=0 applied=259 ignored=0} inserts{seen=12850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20251212 21:10:49.760526 26040 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Bootstrap complete.
I20251212 21:10:49.761379 26040 ts_tablet_manager.cc:1403] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Time spent bootstrapping tablet: real 0.061s	user 0.042s	sys 0.016s
I20251212 21:10:49.762248 26040 raft_consensus.cc:359] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 3 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:49.762881 26040 raft_consensus.cc:740] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 3 FOLLOWER]: Becoming Follower/Learner. State: Replica: 35867ec45b8041d48fc8c7bb132375c5, State: Initialized, Role: FOLLOWER
I20251212 21:10:49.763038 26040 consensus_queue.cc:260] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 259, Last appended: 2.260, Last appended by leader: 260, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:49.763299 26148 heartbeater.cc:499] Master 127.23.110.190:45865 was elected leader, sending a full tablet report...
I20251212 21:10:49.763288 26040 ts_tablet_manager.cc:1434] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Time spent starting tablet: real 0.002s	user 0.001s	sys 0.001s
W20251212 21:10:49.780057 25486 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:49.783958 26062 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45736: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:49.828760 26152 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:10:49.828949 26152 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:10:49.828970 26152 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:10:49.830646 26152 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:10:49.830730 26152 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.132
I20251212 21:10:49.832760 26152 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:39679
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.132:41841
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.23.110.132
--webserver_port=34523
--tserver_master_addrs=127.23.110.190:45865
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.26152
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.132
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:10:49.833060 26152 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:10:49.833400 26152 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:10:49.836936 26062 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45736: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:49.837376 26161 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:49.837455 26163 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:49.837726 26160 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:49.838886 25486 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42230: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
I20251212 21:10:49.839648 26152 server_base.cc:1047] running on GCE node
I20251212 21:10:49.839845 26152 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:10:49.840097 26152 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:10:49.841274 26152 hybrid_clock.cc:648] HybridClock initialized: now 1765573849841229 us; error 53 us; skew 500 ppm
I20251212 21:10:49.842550 26152 webserver.cc:492] Webserver started at http://127.23.110.132:34523/ using document root <none> and password file <none>
I20251212 21:10:49.842744 26152 fs_manager.cc:362] Metadata directory not provided
I20251212 21:10:49.842805 26152 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:10:49.844130 26152 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:49.845230 26169 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:49.845523 26152 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:10:49.845818 26152 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
uuid: "3d78cc34680848ddacf9620033efe712"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:10:49.846241 26152 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251212 21:10:49.874974 26152 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:10:49.875231 26152 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:10:49.875324 26152 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:10:49.875555 26152 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:10:49.875898 26152 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251212 21:10:49.875929 26152 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:49.875952 26152 ts_tablet_manager.cc:616] Registered 0 tablets
I20251212 21:10:49.875967 26152 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:49.881726 26152 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.132:41841
I20251212 21:10:49.881783 26282 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.132:41841 every 8 connection(s)
I20251212 21:10:49.882083 26152 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data/info.pb
I20251212 21:10:49.885979 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 26152
I20251212 21:10:49.886093 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 25444
I20251212 21:10:49.890132 26283 heartbeater.cc:344] Connected to a master server at 127.23.110.190:45865
I20251212 21:10:49.890259 26283 heartbeater.cc:461] Registering TS with master...
I20251212 21:10:49.890498 26283 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:10:49.890938 25114 ts_manager.cc:194] Re-registered known tserver with Master: 3d78cc34680848ddacf9620033efe712 (127.23.110.132:41841)
I20251212 21:10:49.891418 25114 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.132:59493
I20251212 21:10:49.891655 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.131:33221
--local_ip_for_outbound_sockets=127.23.110.131
--tserver_master_addrs=127.23.110.190:45865
--webserver_port=43311
--webserver_interface=127.23.110.131
--builtin_ntp_servers=127.23.110.148:39679
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251212 21:10:49.894711 25923 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33366: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:49.894834 26062 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45736: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
I20251212 21:10:49.945590 26018 raft_consensus.cc:493] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 3 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251212 21:10:49.945780 26018 raft_consensus.cc:515] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 3 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:49.946090 26018 leader_election.cc:290] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 4 pre-election: Requested pre-vote from peers 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339), c941504e89314a6a868d59585d254b81 (127.23.110.131:33221)
W20251212 21:10:49.946468 25900 proxy.cc:239] Call had error, refreshing address and retrying: Network error: Client connection negotiation failed: client connection to 127.23.110.131:33221: connect: Connection refused (error 111)
W20251212 21:10:49.946967 25900 leader_election.cc:336] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 4 pre-election: RPC error from VoteRequest() call to peer c941504e89314a6a868d59585d254b81 (127.23.110.131:33221): Network error: Client connection negotiation failed: client connection to 127.23.110.131:33221: connect: Connection refused (error 111)
I20251212 21:10:49.949486 26102 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "dd9e48fb810447718c09aca5a01b0fe3" candidate_term: 4 candidate_status { last_received { term: 2 index: 259 } } ignore_live_leader: false dest_uuid: "35867ec45b8041d48fc8c7bb132375c5" is_pre_election: true
I20251212 21:10:49.949677 26102 raft_consensus.cc:2410] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 3 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate dd9e48fb810447718c09aca5a01b0fe3 for term 4 because replica has last-logged OpId of term: 2 index: 260, which is greater than that of the candidate, which has last-logged OpId of term: 2 index: 259.
I20251212 21:10:49.949939 25898 leader_election.cc:304] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 4 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: dd9e48fb810447718c09aca5a01b0fe3; no voters: 35867ec45b8041d48fc8c7bb132375c5, c941504e89314a6a868d59585d254b81
I20251212 21:10:49.950111 26018 raft_consensus.cc:2749] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 3 FOLLOWER]: Leader pre-election lost for term 4. Reason: could not achieve majority
W20251212 21:10:49.955451 26062 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45736: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:49.973697 26286 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:10:49.973881 26286 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:10:49.973910 26286 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:10:49.975462 26286 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:10:49.975553 26286 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.131
I20251212 21:10:49.977331 26286 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:39679
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.131:33221
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.23.110.131
--webserver_port=43311
--tserver_master_addrs=127.23.110.190:45865
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.26286
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.131
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:10:49.977610 26286 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:10:49.977844 26286 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:10:49.980913 26293 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:49.980960 26294 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:49.980929 26296 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:10:49.981019 26286 server_base.cc:1047] running on GCE node
I20251212 21:10:49.981405 26286 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:10:49.981673 26286 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:10:49.982811 26286 hybrid_clock.cc:648] HybridClock initialized: now 1765573849982800 us; error 29 us; skew 500 ppm
I20251212 21:10:49.984247 26286 webserver.cc:492] Webserver started at http://127.23.110.131:43311/ using document root <none> and password file <none>
I20251212 21:10:49.984440 26286 fs_manager.cc:362] Metadata directory not provided
I20251212 21:10:49.984505 26286 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:10:49.985852 26286 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:49.986632 26302 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:49.986837 26286 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:49.986917 26286 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
uuid: "c941504e89314a6a868d59585d254b81"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:10:49.987212 26286 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251212 21:10:49.999966 26154 raft_consensus.cc:493] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 3 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251212 21:10:50.000120 26154 raft_consensus.cc:515] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 3 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:50.000424 26154 leader_election.cc:290] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [CANDIDATE]: Term 4 pre-election: Requested pre-vote from peers c941504e89314a6a868d59585d254b81 (127.23.110.131:33221), dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130:36255)
W20251212 21:10:50.000934 26037 proxy.cc:239] Call had error, refreshing address and retrying: Network error: Client connection negotiation failed: client connection to 127.23.110.131:33221: connect: Connection refused (error 111)
W20251212 21:10:50.001592 26037 leader_election.cc:336] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [CANDIDATE]: Term 4 pre-election: RPC error from VoteRequest() call to peer c941504e89314a6a868d59585d254b81 (127.23.110.131:33221): Network error: Client connection negotiation failed: client connection to 127.23.110.131:33221: connect: Connection refused (error 111)
I20251212 21:10:50.002698 26286 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:10:50.002941 26286 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:10:50.003046 26286 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:10:50.003261 26286 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:10:50.003724 26311 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251212 21:10:50.003746 25965 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "35867ec45b8041d48fc8c7bb132375c5" candidate_term: 4 candidate_status { last_received { term: 2 index: 260 } } ignore_live_leader: false dest_uuid: "dd9e48fb810447718c09aca5a01b0fe3" is_pre_election: true
I20251212 21:10:50.003887 25965 raft_consensus.cc:2468] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 3 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 35867ec45b8041d48fc8c7bb132375c5 in term 3.
I20251212 21:10:50.004070 26036 leader_election.cc:304] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [CANDIDATE]: Term 4 pre-election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 35867ec45b8041d48fc8c7bb132375c5, dd9e48fb810447718c09aca5a01b0fe3; no voters: c941504e89314a6a868d59585d254b81
I20251212 21:10:50.004202 26154 raft_consensus.cc:2804] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 3 FOLLOWER]: Leader pre-election won for term 4
I20251212 21:10:50.004290 26154 raft_consensus.cc:493] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 3 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251212 21:10:50.004315 26154 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 3 FOLLOWER]: Advancing to term 4
I20251212 21:10:50.004777 26286 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251212 21:10:50.004842 26286 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:10:50.004882 26286 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251212 21:10:50.005411 26154 raft_consensus.cc:515] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 4 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:50.005533 26286 ts_tablet_manager.cc:616] Registered 1 tablets
I20251212 21:10:50.005560 26154 leader_election.cc:290] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [CANDIDATE]: Term 4 election: Requested vote from peers c941504e89314a6a868d59585d254b81 (127.23.110.131:33221), dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130:36255)
I20251212 21:10:50.005574 26286 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:50.005788 26311 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Bootstrap starting.
I20251212 21:10:50.006035 25965 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "35867ec45b8041d48fc8c7bb132375c5" candidate_term: 4 candidate_status { last_received { term: 2 index: 260 } } ignore_live_leader: false dest_uuid: "dd9e48fb810447718c09aca5a01b0fe3"
I20251212 21:10:50.006122 25965 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 3 FOLLOWER]: Advancing to term 4
W20251212 21:10:50.006285 26037 leader_election.cc:336] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [CANDIDATE]: Term 4 election: RPC error from VoteRequest() call to peer c941504e89314a6a868d59585d254b81 (127.23.110.131:33221): Network error: Client connection negotiation failed: client connection to 127.23.110.131:33221: connect: Connection refused (error 111)
I20251212 21:10:50.007393 25965 raft_consensus.cc:2468] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 4 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 35867ec45b8041d48fc8c7bb132375c5 in term 4.
I20251212 21:10:50.007611 26036 leader_election.cc:304] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [CANDIDATE]: Term 4 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 35867ec45b8041d48fc8c7bb132375c5, dd9e48fb810447718c09aca5a01b0fe3; no voters: c941504e89314a6a868d59585d254b81
I20251212 21:10:50.007742 26154 raft_consensus.cc:2804] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 4 FOLLOWER]: Leader election won for term 4
I20251212 21:10:50.007862 26154 raft_consensus.cc:697] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 4 LEADER]: Becoming Leader. State: Replica: 35867ec45b8041d48fc8c7bb132375c5, State: Running, Role: LEADER
I20251212 21:10:50.007961 26154 consensus_queue.cc:237] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 259, Committed index: 259, Last appended: 2.260, Last appended by leader: 260, Current term: 4, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:50.008701 25114 catalog_manager.cc:5654] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 reported cstate change: term changed from 2 to 4. New cstate: current_term: 4 leader_uuid: "35867ec45b8041d48fc8c7bb132375c5" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } health_report { overall_health: UNKNOWN } } }
I20251212 21:10:50.011804 26286 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.131:33221
I20251212 21:10:50.011940 26424 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.131:33221 every 8 connection(s)
I20251212 21:10:50.012171 26286 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data/info.pb
I20251212 21:10:50.016644 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 26286
I20251212 21:10:50.020102 26311 log.cc:826] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Log is configured to *not* fsync() on all Append() calls
I20251212 21:10:50.038180 26425 heartbeater.cc:344] Connected to a master server at 127.23.110.190:45865
I20251212 21:10:50.038301 26425 heartbeater.cc:461] Registering TS with master...
I20251212 21:10:50.038527 26425 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:10:50.039170 25114 ts_manager.cc:194] Re-registered known tserver with Master: c941504e89314a6a868d59585d254b81 (127.23.110.131:33221)
I20251212 21:10:50.039705 25114 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.131:42395
W20251212 21:10:50.079535 25924 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33366: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:50.081609 25924 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33366: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
I20251212 21:10:50.082052 26311 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Bootstrap replayed 1/1 log segments. Stats: ops{read=260 overwritten=0 applied=259 ignored=0} inserts{seen=12850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20251212 21:10:50.082508 26311 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Bootstrap complete.
I20251212 21:10:50.083575 26311 ts_tablet_manager.cc:1403] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Time spent bootstrapping tablet: real 0.078s	user 0.049s	sys 0.023s
I20251212 21:10:50.084702 26311 raft_consensus.cc:359] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 3 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:50.085464 26311 raft_consensus.cc:740] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 3 FOLLOWER]: Becoming Follower/Learner. State: Replica: c941504e89314a6a868d59585d254b81, State: Initialized, Role: FOLLOWER
I20251212 21:10:50.085587 26311 consensus_queue.cc:260] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 259, Last appended: 2.260, Last appended by leader: 260, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:50.085835 26311 ts_tablet_manager.cc:1434] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Time spent starting tablet: real 0.002s	user 0.004s	sys 0.000s
I20251212 21:10:50.085870 26425 heartbeater.cc:499] Master 127.23.110.190:45865 was elected leader, sending a full tablet report...
I20251212 21:10:50.088943 26378 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 3 FOLLOWER]: Advancing to term 4
I20251212 21:10:50.089118 25965 raft_consensus.cc:1275] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 4 FOLLOWER]: Refusing update from remote peer 35867ec45b8041d48fc8c7bb132375c5: Log matching property violated. Preceding OpId in replica: term: 2 index: 259. Preceding OpId from leader: term: 4 index: 261. (index mismatch)
I20251212 21:10:50.089435 26154 consensus_queue.cc:1048] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [LEADER]: Connected to new peer: Peer: permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 261, Last known committed idx: 259, Time since last communication: 0.000s
I20251212 21:10:50.090286 26378 raft_consensus.cc:1275] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 4 FOLLOWER]: Refusing update from remote peer 35867ec45b8041d48fc8c7bb132375c5: Log matching property violated. Preceding OpId in replica: term: 2 index: 260. Preceding OpId from leader: term: 4 index: 261. (index mismatch)
I20251212 21:10:50.090706 26154 consensus_queue.cc:1048] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [LEADER]: Connected to new peer: Peer: permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 261, Last known committed idx: 259, Time since last communication: 0.000s
W20251212 21:10:50.093981 25924 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33366: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
I20251212 21:10:50.195238 26217 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:10:50.196558 26355 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:10:50.200014 25945 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:10:50.202719 26082 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20251212 21:10:50.378366 25114 ts_manager.cc:284] Unset tserver state for 35867ec45b8041d48fc8c7bb132375c5 from MAINTENANCE_MODE
I20251212 21:10:50.487924 25114 ts_manager.cc:284] Unset tserver state for c941504e89314a6a868d59585d254b81 from MAINTENANCE_MODE
I20251212 21:10:50.531082 25114 ts_manager.cc:284] Unset tserver state for dd9e48fb810447718c09aca5a01b0fe3 from MAINTENANCE_MODE
I20251212 21:10:50.541599 25114 ts_manager.cc:284] Unset tserver state for 3d78cc34680848ddacf9620033efe712 from MAINTENANCE_MODE
I20251212 21:10:50.770794 25114 ts_manager.cc:295] Set tserver state for c941504e89314a6a868d59585d254b81 to MAINTENANCE_MODE
I20251212 21:10:50.846580 25114 ts_manager.cc:295] Set tserver state for 3d78cc34680848ddacf9620033efe712 to MAINTENANCE_MODE
I20251212 21:10:50.873890 25114 ts_manager.cc:295] Set tserver state for 35867ec45b8041d48fc8c7bb132375c5 to MAINTENANCE_MODE
I20251212 21:10:50.893748 26283 heartbeater.cc:499] Master 127.23.110.190:45865 was elected leader, sending a full tablet report...
I20251212 21:10:50.900198 25114 ts_manager.cc:295] Set tserver state for dd9e48fb810447718c09aca5a01b0fe3 to MAINTENANCE_MODE
I20251212 21:10:51.046169 26082 tablet_service.cc:1460] Tablet server 35867ec45b8041d48fc8c7bb132375c5 set to quiescing
I20251212 21:10:51.046238 26082 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20251212 21:10:51.047657 26154 raft_consensus.cc:993] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: : Instructing follower c941504e89314a6a868d59585d254b81 to start an election
I20251212 21:10:51.047730 26154 raft_consensus.cc:1081] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 4 LEADER]: Signalling peer c941504e89314a6a868d59585d254b81 to start an election
I20251212 21:10:51.048142 26336 raft_consensus.cc:993] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: : Instructing follower c941504e89314a6a868d59585d254b81 to start an election
I20251212 21:10:51.048197 26336 raft_consensus.cc:1081] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 4 LEADER]: Signalling peer c941504e89314a6a868d59585d254b81 to start an election
I20251212 21:10:51.048660 26377 tablet_service.cc:2038] Received Run Leader Election RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122"
dest_uuid: "c941504e89314a6a868d59585d254b81"
 from {username='slave'} at 127.23.110.129:43491
I20251212 21:10:51.048776 26377 raft_consensus.cc:493] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 4 FOLLOWER]: Starting forced leader election (received explicit request)
I20251212 21:10:51.048833 26377 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 4 FOLLOWER]: Advancing to term 5
I20251212 21:10:51.049365 26378 tablet_service.cc:2038] Received Run Leader Election RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122"
dest_uuid: "c941504e89314a6a868d59585d254b81"
 from {username='slave'} at 127.23.110.129:43491
I20251212 21:10:51.049666 26377 raft_consensus.cc:515] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 5 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:51.049916 26377 leader_election.cc:290] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [CANDIDATE]: Term 5 election: Requested vote from peers 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339), dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130:36255)
I20251212 21:10:51.050556 26378 raft_consensus.cc:493] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 5 FOLLOWER]: Starting forced leader election (received explicit request)
I20251212 21:10:51.050649 26378 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 5 FOLLOWER]: Advancing to term 6
I20251212 21:10:51.051404 26378 raft_consensus.cc:515] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 6 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:51.051604 26376 raft_consensus.cc:1240] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 6 FOLLOWER]: Rejecting Update request from peer 35867ec45b8041d48fc8c7bb132375c5 for earlier term 4. Current term is 6. Ops: [4.840-4.840]
I20251212 21:10:51.051848 26378 leader_election.cc:290] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [CANDIDATE]: Term 6 election: Requested vote from peers 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339), dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130:36255)
I20251212 21:10:51.052067 26154 consensus_queue.cc:1059] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 }, Status: INVALID_TERM, Last received: 4.839, Next index: 840, Last known committed idx: 838, Time since last communication: 0.000s
I20251212 21:10:51.052170 26154 raft_consensus.cc:3055] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 4 LEADER]: Stepping down as leader of term 4
I20251212 21:10:51.052201 26154 raft_consensus.cc:740] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 4 LEADER]: Becoming Follower/Learner. State: Replica: 35867ec45b8041d48fc8c7bb132375c5, State: Running, Role: LEADER
I20251212 21:10:51.052300 26154 consensus_queue.cc:260] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 842, Committed index: 842, Last appended: 4.842, Last appended by leader: 842, Current term: 4, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:51.052490 26154 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 4 FOLLOWER]: Advancing to term 6
I20251212 21:10:51.057325 26102 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "c941504e89314a6a868d59585d254b81" candidate_term: 5 candidate_status { last_received { term: 4 index: 839 } } ignore_live_leader: true dest_uuid: "35867ec45b8041d48fc8c7bb132375c5"
I20251212 21:10:51.057446 26102 raft_consensus.cc:2368] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 6 FOLLOWER]: Leader election vote request: Denying vote to candidate c941504e89314a6a868d59585d254b81 for earlier term 5. Current term is 6.
I20251212 21:10:51.057581 26101 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "c941504e89314a6a868d59585d254b81" candidate_term: 6 candidate_status { last_received { term: 4 index: 839 } } ignore_live_leader: true dest_uuid: "35867ec45b8041d48fc8c7bb132375c5"
I20251212 21:10:51.057642 26101 raft_consensus.cc:2410] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 6 FOLLOWER]: Leader election vote request: Denying vote to candidate c941504e89314a6a868d59585d254b81 for term 6 because replica has last-logged OpId of term: 4 index: 842, which is greater than that of the candidate, which has last-logged OpId of term: 4 index: 839.
I20251212 21:10:51.058019 26304 leader_election.cc:400] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [CANDIDATE]: Term 5 election: Vote denied by peer 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339) with higher term. Message: Invalid argument: T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 6 FOLLOWER]: Leader election vote request: Denying vote to candidate c941504e89314a6a868d59585d254b81 for earlier term 5. Current term is 6.
I20251212 21:10:51.058075 26304 leader_election.cc:403] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [CANDIDATE]: Term 5 election: Cancelling election due to peer responding with higher term
I20251212 21:10:51.061350 25964 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "c941504e89314a6a868d59585d254b81" candidate_term: 6 candidate_status { last_received { term: 4 index: 839 } } ignore_live_leader: true dest_uuid: "dd9e48fb810447718c09aca5a01b0fe3"
I20251212 21:10:51.061458 25964 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 4 FOLLOWER]: Advancing to term 6
I20251212 21:10:51.061350 25965 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "c941504e89314a6a868d59585d254b81" candidate_term: 5 candidate_status { last_received { term: 4 index: 839 } } ignore_live_leader: true dest_uuid: "dd9e48fb810447718c09aca5a01b0fe3"
I20251212 21:10:51.062438 25964 raft_consensus.cc:2410] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 6 FOLLOWER]: Leader election vote request: Denying vote to candidate c941504e89314a6a868d59585d254b81 for term 6 because replica has last-logged OpId of term: 4 index: 842, which is greater than that of the candidate, which has last-logged OpId of term: 4 index: 839.
I20251212 21:10:51.062858 26305 leader_election.cc:304] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [CANDIDATE]: Term 6 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: c941504e89314a6a868d59585d254b81; no voters: 35867ec45b8041d48fc8c7bb132375c5, dd9e48fb810447718c09aca5a01b0fe3
I20251212 21:10:51.063277 26587 raft_consensus.cc:2749] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 6 FOLLOWER]: Leader election lost for term 5. Reason: Vote denied by peer 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339) with higher term. Message: Invalid argument: T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 6 FOLLOWER]: Leader election vote request: Denying vote to candidate c941504e89314a6a868d59585d254b81 for earlier term 5. Current term is 6.
I20251212 21:10:51.063444 26587 raft_consensus.cc:2749] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 6 FOLLOWER]: Leader election lost for term 6. Reason: could not achieve majority
I20251212 21:10:51.092947 26148 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:10:51.093837 26011 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:10:51.094141 26425 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:10:51.112579 26217 tablet_service.cc:1460] Tablet server 3d78cc34680848ddacf9620033efe712 set to quiescing
I20251212 21:10:51.112655 26217 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:10:51.137341 26355 tablet_service.cc:1460] Tablet server c941504e89314a6a868d59585d254b81 set to quiescing
I20251212 21:10:51.137442 26355 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:10:51.182988 25945 tablet_service.cc:1460] Tablet server dd9e48fb810447718c09aca5a01b0fe3 set to quiescing
I20251212 21:10:51.183053 25945 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251212 21:10:51.321954 26154 raft_consensus.cc:670] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: failed to trigger leader election: Illegal state: leader elections are disabled
W20251212 21:10:51.354223 26617 raft_consensus.cc:670] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: failed to trigger leader election: Illegal state: leader elections are disabled
W20251212 21:10:51.543305 26586 raft_consensus.cc:670] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: failed to trigger leader election: Illegal state: leader elections are disabled
I20251212 21:10:52.194410 26082 tablet_service.cc:1460] Tablet server 35867ec45b8041d48fc8c7bb132375c5 set to quiescing
I20251212 21:10:52.194485 26082 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:10:52.249998 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 25881
I20251212 21:10:52.255873 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.130:36255
--local_ip_for_outbound_sockets=127.23.110.130
--tserver_master_addrs=127.23.110.190:45865
--webserver_port=40621
--webserver_interface=127.23.110.130
--builtin_ntp_servers=127.23.110.148:39679
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251212 21:10:52.335340 26629 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:10:52.335539 26629 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:10:52.335572 26629 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:10:52.337035 26629 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:10:52.337107 26629 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.130
I20251212 21:10:52.338707 26629 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:39679
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.130:36255
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.23.110.130
--webserver_port=40621
--tserver_master_addrs=127.23.110.190:45865
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.26629
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.130
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:10:52.338968 26629 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:10:52.339195 26629 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:10:52.341979 26637 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:52.341990 26634 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:52.342034 26635 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:10:52.342270 26629 server_base.cc:1047] running on GCE node
I20251212 21:10:52.342451 26629 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:10:52.342648 26629 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:10:52.343789 26629 hybrid_clock.cc:648] HybridClock initialized: now 1765573852343761 us; error 35 us; skew 500 ppm
I20251212 21:10:52.344913 26629 webserver.cc:492] Webserver started at http://127.23.110.130:40621/ using document root <none> and password file <none>
I20251212 21:10:52.345103 26629 fs_manager.cc:362] Metadata directory not provided
I20251212 21:10:52.345144 26629 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:10:52.346354 26629 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:52.347007 26643 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:52.347180 26629 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20251212 21:10:52.347247 26629 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
uuid: "dd9e48fb810447718c09aca5a01b0fe3"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:10:52.347517 26629 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251212 21:10:52.358896 26629 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:10:52.359145 26629 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:10:52.359246 26629 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:10:52.359427 26629 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:10:52.359830 26650 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251212 21:10:52.360775 26629 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251212 21:10:52.360822 26629 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:10:52.360846 26629 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251212 21:10:52.361444 26629 ts_tablet_manager.cc:616] Registered 1 tablets
I20251212 21:10:52.361479 26629 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:52.361572 26650 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Bootstrap starting.
I20251212 21:10:52.368691 26629 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.130:36255
I20251212 21:10:52.368768 26757 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.130:36255 every 8 connection(s)
I20251212 21:10:52.369067 26629 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data/info.pb
I20251212 21:10:52.370419 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 26629
I20251212 21:10:52.370548 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 26017
I20251212 21:10:52.374949 26758 heartbeater.cc:344] Connected to a master server at 127.23.110.190:45865
I20251212 21:10:52.375061 26758 heartbeater.cc:461] Registering TS with master...
I20251212 21:10:52.375313 26758 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:10:52.375895 25116 ts_manager.cc:194] Re-registered known tserver with Master: dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130:36255)
I20251212 21:10:52.376458 25116 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.130:48513
W20251212 21:10:52.377071 25715 meta_cache.cc:302] tablet edaf6af028f1466d9dccb7d78cf88122: replica 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339) has failed: Network error: recv got EOF from 127.23.110.129:42339 (error 108)
I20251212 21:10:52.378428 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.129:42339
--local_ip_for_outbound_sockets=127.23.110.129
--tserver_master_addrs=127.23.110.190:45865
--webserver_port=34247
--webserver_interface=127.23.110.129
--builtin_ntp_servers=127.23.110.148:39679
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251212 21:10:52.380411 26332 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42250: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.380569 26333 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42250: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.383769 26333 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42250: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
I20251212 21:10:52.384704 26650 log.cc:826] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Log is configured to *not* fsync() on all Append() calls
W20251212 21:10:52.397408 26333 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42250: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.399469 26333 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42250: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.403676 26333 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42250: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.425168 26332 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42250: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.425168 26333 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42250: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.429286 26333 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42250: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.460194 26333 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42250: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.463385 26762 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:10:52.463554 26762 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:10:52.463583 26762 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:10:52.465111 26762 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:10:52.465185 26762 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.129
W20251212 21:10:52.466334 26332 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42250: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.466336 26333 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42250: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
I20251212 21:10:52.467136 26762 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:39679
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.129:42339
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.23.110.129
--webserver_port=34247
--tserver_master_addrs=127.23.110.190:45865
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.26762
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.129
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:10:52.467381 26762 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:10:52.467605 26762 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:10:52.470307 26773 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:52.470294 26770 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:52.470377 26771 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:10:52.470674 26762 server_base.cc:1047] running on GCE node
I20251212 21:10:52.470832 26762 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:10:52.471025 26762 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:10:52.472179 26762 hybrid_clock.cc:648] HybridClock initialized: now 1765573852472161 us; error 31 us; skew 500 ppm
I20251212 21:10:52.473423 26762 webserver.cc:492] Webserver started at http://127.23.110.129:34247/ using document root <none> and password file <none>
I20251212 21:10:52.473613 26762 fs_manager.cc:362] Metadata directory not provided
I20251212 21:10:52.473655 26762 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:10:52.474942 26762 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:52.475580 26779 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:52.475729 26762 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20251212 21:10:52.475798 26762 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
uuid: "35867ec45b8041d48fc8c7bb132375c5"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:10:52.476066 26762 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251212 21:10:52.488353 26762 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:10:52.488610 26762 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:10:52.488705 26762 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:10:52.488927 26762 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:10:52.489408 26786 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251212 21:10:52.490238 26762 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251212 21:10:52.490283 26762 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:10:52.490307 26762 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251212 21:10:52.490877 26762 ts_tablet_manager.cc:616] Registered 1 tablets
I20251212 21:10:52.490949 26762 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:52.490973 26786 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Bootstrap starting.
I20251212 21:10:52.498087 26762 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.129:42339
I20251212 21:10:52.498574 26762 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data/info.pb
I20251212 21:10:52.499082 26893 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.129:42339 every 8 connection(s)
I20251212 21:10:52.509526 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 26762
I20251212 21:10:52.509635 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 26152
I20251212 21:10:52.518949 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.132:41841
--local_ip_for_outbound_sockets=127.23.110.132
--tserver_master_addrs=127.23.110.190:45865
--webserver_port=34523
--webserver_interface=127.23.110.132
--builtin_ntp_servers=127.23.110.148:39679
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251212 21:10:52.520416 26786 log.cc:826] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Log is configured to *not* fsync() on all Append() calls
I20251212 21:10:52.522341 26894 heartbeater.cc:344] Connected to a master server at 127.23.110.190:45865
I20251212 21:10:52.522437 26894 heartbeater.cc:461] Registering TS with master...
I20251212 21:10:52.522636 26894 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:10:52.523195 25116 ts_manager.cc:194] Re-registered known tserver with Master: 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339)
I20251212 21:10:52.523738 25116 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.129:40807
I20251212 21:10:52.527493 26650 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Bootstrap replayed 1/1 log segments. Stats: ops{read=842 overwritten=0 applied=842 ignored=0} inserts{seen=41950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251212 21:10:52.528049 26650 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Bootstrap complete.
I20251212 21:10:52.529345 26650 ts_tablet_manager.cc:1403] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Time spent bootstrapping tablet: real 0.168s	user 0.142s	sys 0.024s
I20251212 21:10:52.530748 26650 raft_consensus.cc:359] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 6 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:52.531157 26650 raft_consensus.cc:740] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 6 FOLLOWER]: Becoming Follower/Learner. State: Replica: dd9e48fb810447718c09aca5a01b0fe3, State: Initialized, Role: FOLLOWER
I20251212 21:10:52.531288 26650 consensus_queue.cc:260] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 842, Last appended: 4.842, Last appended by leader: 842, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:52.531594 26758 heartbeater.cc:499] Master 127.23.110.190:45865 was elected leader, sending a full tablet report...
I20251212 21:10:52.531776 26650 ts_tablet_manager.cc:1434] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
W20251212 21:10:52.533231 26333 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42250: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.534353 26332 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42250: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.534349 26333 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42250: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.553694 26671 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.555060 26671 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.555060 26670 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.561343 26333 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42250: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.561564 26332 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42250: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.565544 26332 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42250: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.574373 26332 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42250: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.576496 26332 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42250: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.579696 26332 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42250: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.580294 26670 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.582383 26670 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.589546 26670 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.599329 26332 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42250: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.601440 26332 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42250: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.608044 26898 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:10:52.608213 26898 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:10:52.608234 26898 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:10:52.609759 26898 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:10:52.609817 26898 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.132
W20251212 21:10:52.610603 26332 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42250: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
I20251212 21:10:52.611562 26898 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:39679
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.132:41841
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.23.110.132
--webserver_port=34523
--tserver_master_addrs=127.23.110.190:45865
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.26898
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.132
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:10:52.611771 26898 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:10:52.612028 26898 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:10:52.612330 26670 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.612505 26671 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.614751 26908 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:52.614751 26906 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:52.614815 26905 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:10:52.615604 26898 server_base.cc:1047] running on GCE node
I20251212 21:10:52.615789 26898 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:10:52.615998 26898 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:10:52.617161 26898 hybrid_clock.cc:648] HybridClock initialized: now 1765573852617144 us; error 29 us; skew 500 ppm
I20251212 21:10:52.618456 26898 webserver.cc:492] Webserver started at http://127.23.110.132:34523/ using document root <none> and password file <none>
I20251212 21:10:52.618651 26898 fs_manager.cc:362] Metadata directory not provided
I20251212 21:10:52.618690 26898 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:10:52.619859 26898 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:52.620427 26914 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:52.620604 26898 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:52.620697 26898 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
uuid: "3d78cc34680848ddacf9620033efe712"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:10:52.621044 26898 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20251212 21:10:52.622413 26671 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.635365 26332 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42250: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.638505 26332 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42250: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
I20251212 21:10:52.642158 26898 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:10:52.642439 26898 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:10:52.642565 26898 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:10:52.642809 26898 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:10:52.643188 26898 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251212 21:10:52.643232 26898 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:52.643263 26898 ts_tablet_manager.cc:616] Registered 0 tablets
I20251212 21:10:52.643287 26898 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
W20251212 21:10:52.644644 26332 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42250: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.648376 26671 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
I20251212 21:10:52.650596 26898 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.132:41841
I20251212 21:10:52.650692 27027 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.132:41841 every 8 connection(s)
I20251212 21:10:52.651100 26898 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data/info.pb
W20251212 21:10:52.653487 26671 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
I20251212 21:10:52.656121 27028 heartbeater.cc:344] Connected to a master server at 127.23.110.190:45865
I20251212 21:10:52.656219 27028 heartbeater.cc:461] Registering TS with master...
I20251212 21:10:52.656406 27028 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:10:52.656754 25114 ts_manager.cc:194] Re-registered known tserver with Master: 3d78cc34680848ddacf9620033efe712 (127.23.110.132:41841)
I20251212 21:10:52.657040 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 26898
I20251212 21:10:52.657121 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 26286
I20251212 21:10:52.657207 25114 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.132:34725
W20251212 21:10:52.659727 26671 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
I20251212 21:10:52.662962 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.131:33221
--local_ip_for_outbound_sockets=127.23.110.131
--tserver_master_addrs=127.23.110.190:45865
--webserver_port=43311
--webserver_interface=127.23.110.131
--builtin_ntp_servers=127.23.110.148:39679
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251212 21:10:52.678556 26786 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Bootstrap replayed 1/1 log segments. Stats: ops{read=842 overwritten=0 applied=839 ignored=0} inserts{seen=41800 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251212 21:10:52.678877 26786 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Bootstrap complete.
I20251212 21:10:52.679766 26786 ts_tablet_manager.cc:1403] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Time spent bootstrapping tablet: real 0.189s	user 0.134s	sys 0.041s
I20251212 21:10:52.680967 26786 raft_consensus.cc:359] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 6 FOLLOWER]: Replica starting. Triggering 3 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:52.681674 26786 raft_consensus.cc:740] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 6 FOLLOWER]: Becoming Follower/Learner. State: Replica: 35867ec45b8041d48fc8c7bb132375c5, State: Initialized, Role: FOLLOWER
I20251212 21:10:52.681790 26786 consensus_queue.cc:260] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 839, Last appended: 4.842, Last appended by leader: 842, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:52.682030 26786 ts_tablet_manager.cc:1434] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Time spent starting tablet: real 0.002s	user 0.005s	sys 0.000s
I20251212 21:10:52.682062 26894 heartbeater.cc:499] Master 127.23.110.190:45865 was elected leader, sending a full tablet report...
W20251212 21:10:52.685475 26671 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.694521 26671 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.700806 26671 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.703078 26790 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.718322 26790 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.734189 26790 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.741437 26671 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.746924 27031 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:10:52.747124 27031 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:10:52.747161 27031 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:10:52.748678 27031 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:10:52.748762 27031 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.131
I20251212 21:10:52.750452 27031 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:39679
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.131:33221
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.23.110.131
--webserver_port=43311
--tserver_master_addrs=127.23.110.190:45865
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.27031
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.131
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:10:52.750762 27031 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:10:52.751015 27031 file_cache.cc:492] Constructed file cache file cache with capacity 419430
I20251212 21:10:52.754127 27031 server_base.cc:1047] running on GCE node
W20251212 21:10:52.754107 27041 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:52.754184 26671 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:52.754101 27039 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:10:52.754365 27038 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:10:52.754568 27031 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:10:52.754774 27031 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:10:52.755932 27031 hybrid_clock.cc:648] HybridClock initialized: now 1765573852755917 us; error 30 us; skew 500 ppm
I20251212 21:10:52.757118 27031 webserver.cc:492] Webserver started at http://127.23.110.131:43311/ using document root <none> and password file <none>
I20251212 21:10:52.757349 27031 fs_manager.cc:362] Metadata directory not provided
I20251212 21:10:52.757397 27031 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:10:52.758666 27031 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:52.759382 27047 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:10:52.759567 27031 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20251212 21:10:52.759646 27031 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
uuid: "c941504e89314a6a868d59585d254b81"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:10:52.759975 27031 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20251212 21:10:52.760373 26790 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
I20251212 21:10:52.770068 26900 raft_consensus.cc:493] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 6 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251212 21:10:52.770210 26900 raft_consensus.cc:515] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 6 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:52.770526 26900 leader_election.cc:290] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 7 pre-election: Requested pre-vote from peers 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339), c941504e89314a6a868d59585d254b81 (127.23.110.131:33221)
W20251212 21:10:52.771025 26647 proxy.cc:239] Call had error, refreshing address and retrying: Network error: Client connection negotiation failed: client connection to 127.23.110.131:33221: connect: Connection refused (error 111)
W20251212 21:10:52.771525 26647 leader_election.cc:336] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 7 pre-election: RPC error from VoteRequest() call to peer c941504e89314a6a868d59585d254b81 (127.23.110.131:33221): Network error: Client connection negotiation failed: client connection to 127.23.110.131:33221: connect: Connection refused (error 111)
W20251212 21:10:52.773677 26790 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
I20251212 21:10:52.774039 26838 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "dd9e48fb810447718c09aca5a01b0fe3" candidate_term: 7 candidate_status { last_received { term: 4 index: 842 } } ignore_live_leader: false dest_uuid: "35867ec45b8041d48fc8c7bb132375c5" is_pre_election: true
I20251212 21:10:52.774176 26838 raft_consensus.cc:2468] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 6 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate dd9e48fb810447718c09aca5a01b0fe3 in term 6.
I20251212 21:10:52.774343 26645 leader_election.cc:304] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 7 pre-election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 35867ec45b8041d48fc8c7bb132375c5, dd9e48fb810447718c09aca5a01b0fe3; no voters: c941504e89314a6a868d59585d254b81
I20251212 21:10:52.774459 26900 raft_consensus.cc:2804] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 6 FOLLOWER]: Leader pre-election won for term 7
I20251212 21:10:52.774549 26900 raft_consensus.cc:493] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 6 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251212 21:10:52.774570 26900 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 6 FOLLOWER]: Advancing to term 7
I20251212 21:10:52.775820 26900 raft_consensus.cc:515] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 7 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:52.775982 26900 leader_election.cc:290] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 7 election: Requested vote from peers 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339), c941504e89314a6a868d59585d254b81 (127.23.110.131:33221)
I20251212 21:10:52.776167 26838 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "dd9e48fb810447718c09aca5a01b0fe3" candidate_term: 7 candidate_status { last_received { term: 4 index: 842 } } ignore_live_leader: false dest_uuid: "35867ec45b8041d48fc8c7bb132375c5"
I20251212 21:10:52.776242 26838 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 6 FOLLOWER]: Advancing to term 7
W20251212 21:10:52.776532 26647 leader_election.cc:336] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 7 election: RPC error from VoteRequest() call to peer c941504e89314a6a868d59585d254b81 (127.23.110.131:33221): Network error: Client connection negotiation failed: client connection to 127.23.110.131:33221: connect: Connection refused (error 111)
I20251212 21:10:52.777477 26838 raft_consensus.cc:2468] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 7 FOLLOWER]: Leader election vote request: Granting yes vote for candidate dd9e48fb810447718c09aca5a01b0fe3 in term 7.
I20251212 21:10:52.777655 26645 leader_election.cc:304] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 7 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 35867ec45b8041d48fc8c7bb132375c5, dd9e48fb810447718c09aca5a01b0fe3; no voters: c941504e89314a6a868d59585d254b81
I20251212 21:10:52.777782 26900 raft_consensus.cc:2804] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 7 FOLLOWER]: Leader election won for term 7
I20251212 21:10:52.777925 26900 raft_consensus.cc:697] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 7 LEADER]: Becoming Leader. State: Replica: dd9e48fb810447718c09aca5a01b0fe3, State: Running, Role: LEADER
I20251212 21:10:52.778060 26900 consensus_queue.cc:237] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 842, Committed index: 842, Last appended: 4.842, Last appended by leader: 842, Current term: 7, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:52.778800 25114 catalog_manager.cc:5654] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 reported cstate change: term changed from 4 to 7, leader changed from 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129) to dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130). New cstate: current_term: 7 leader_uuid: "dd9e48fb810447718c09aca5a01b0fe3" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } health_report { overall_health: HEALTHY } } }
I20251212 21:10:52.783869 27031 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:10:52.784111 26838 raft_consensus.cc:1275] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 7 FOLLOWER]: Refusing update from remote peer dd9e48fb810447718c09aca5a01b0fe3: Log matching property violated. Preceding OpId in replica: term: 4 index: 842. Preceding OpId from leader: term: 7 index: 844. (index mismatch)
I20251212 21:10:52.784193 27031 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:10:52.784312 27031 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:10:52.784533 26900 consensus_queue.cc:1048] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [LEADER]: Connected to new peer: Peer: permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 843, Last known committed idx: 839, Time since last communication: 0.000s
I20251212 21:10:52.784667 27031 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
W20251212 21:10:52.784708 26647 consensus_peers.cc:597] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 -> Peer c941504e89314a6a868d59585d254b81 (127.23.110.131:33221): Couldn't send request to peer c941504e89314a6a868d59585d254b81. Status: Network error: Client connection negotiation failed: client connection to 127.23.110.131:33221: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20251212 21:10:52.785295 27064 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251212 21:10:52.786299 27031 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251212 21:10:52.786348 27031 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:10:52.786373 27031 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251212 21:10:52.787092 27031 ts_tablet_manager.cc:616] Registered 1 tablets
I20251212 21:10:52.787168 27031 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:10:52.787204 27064 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Bootstrap starting.
I20251212 21:10:52.787638 27032 mvcc.cc:204] Tried to move back new op lower bound from 7231790501001412608 to 7231790500979388416. Current Snapshot: MvccSnapshot[applied={T|T < 7231790501001412608 or (T in {7231790501001412608})}]
I20251212 21:10:52.795238 27031 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.131:33221
I20251212 21:10:52.795735 27031 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data/info.pb
I20251212 21:10:52.798259 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 27031
I20251212 21:10:52.806232 27176 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.131:33221 every 8 connection(s)
I20251212 21:10:52.816589 27177 heartbeater.cc:344] Connected to a master server at 127.23.110.190:45865
I20251212 21:10:52.816800 27177 heartbeater.cc:461] Registering TS with master...
I20251212 21:10:52.817090 27177 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:10:52.818078 25116 ts_manager.cc:194] Re-registered known tserver with Master: c941504e89314a6a868d59585d254b81 (127.23.110.131:33221)
I20251212 21:10:52.818547 25116 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.131:50809
I20251212 21:10:52.824512 27064 log.cc:826] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Log is configured to *not* fsync() on all Append() calls
I20251212 21:10:53.002224 26810 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:10:53.005652 26961 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:10:53.013433 27108 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:10:53.014048 26692 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20251212 21:10:53.028865 27064 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Bootstrap replayed 1/1 log segments. Stats: ops{read=839 overwritten=0 applied=838 ignored=0} inserts{seen=41750 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20251212 21:10:53.029341 27064 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Bootstrap complete.
I20251212 21:10:53.030551 27064 ts_tablet_manager.cc:1403] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Time spent bootstrapping tablet: real 0.243s	user 0.180s	sys 0.045s
I20251212 21:10:53.031685 27064 raft_consensus.cc:359] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 6 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:53.032769 27064 raft_consensus.cc:740] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 6 FOLLOWER]: Becoming Follower/Learner. State: Replica: c941504e89314a6a868d59585d254b81, State: Initialized, Role: FOLLOWER
I20251212 21:10:53.032930 27064 consensus_queue.cc:260] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 838, Last appended: 4.839, Last appended by leader: 839, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:53.033164 27064 ts_tablet_manager.cc:1434] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Time spent starting tablet: real 0.002s	user 0.002s	sys 0.000s
I20251212 21:10:53.033293 27177 heartbeater.cc:499] Master 127.23.110.190:45865 was elected leader, sending a full tablet report...
I20251212 21:10:53.036615 27120 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 6 FOLLOWER]: Advancing to term 7
I20251212 21:10:53.038955 27120 raft_consensus.cc:1275] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 7 FOLLOWER]: Refusing update from remote peer dd9e48fb810447718c09aca5a01b0fe3: Log matching property violated. Preceding OpId in replica: term: 4 index: 839. Preceding OpId from leader: term: 4 index: 842. (index mismatch)
I20251212 21:10:53.039422 27208 consensus_queue.cc:1050] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [LEADER]: Got LMP mismatch error from peer: Peer: permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 843, Last known committed idx: 838, Time since last communication: 0.000s
I20251212 21:10:53.068634 27218 mvcc.cc:204] Tried to move back new op lower bound from 7231790502042427392 to 7231790500979388416. Current Snapshot: MvccSnapshot[applied={T|T < 7231790501830709248 or (T in {7231790501845254144,7231790501845725184,7231790501846036480,7231790501857095680,7231790501857587200,7231790501857968128,7231790501864468480,7231790501869199360,7231790501872693248,7231790501876207616,7231790501882970112})}]
I20251212 21:10:53.658445 27028 heartbeater.cc:499] Master 127.23.110.190:45865 was elected leader, sending a full tablet report...
W20251212 21:10:53.720352 25745 scanner-internal.cc:458] Time spent opening tablet: real 2.406s	user 0.001s	sys 0.000s
W20251212 21:10:53.723825 25743 scanner-internal.cc:458] Time spent opening tablet: real 2.407s	user 0.001s	sys 0.000s
W20251212 21:10:53.732003 25744 scanner-internal.cc:458] Time spent opening tablet: real 2.406s	user 0.000s	sys 0.001s
I20251212 21:10:58.325899 26810 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
I20251212 21:10:58.330386 26961 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:10:58.331489 27108 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:10:58.362020 26692 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20251212 21:10:58.619952 25114 ts_manager.cc:284] Unset tserver state for 35867ec45b8041d48fc8c7bb132375c5 from MAINTENANCE_MODE
I20251212 21:10:58.629889 25114 ts_manager.cc:284] Unset tserver state for dd9e48fb810447718c09aca5a01b0fe3 from MAINTENANCE_MODE
I20251212 21:10:58.662014 27028 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:10:58.724476 25114 ts_manager.cc:284] Unset tserver state for 3d78cc34680848ddacf9620033efe712 from MAINTENANCE_MODE
I20251212 21:10:58.754942 25114 ts_manager.cc:284] Unset tserver state for c941504e89314a6a868d59585d254b81 from MAINTENANCE_MODE
I20251212 21:10:58.790904 26894 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:10:59.057405 27177 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:10:59.073139 25114 ts_manager.cc:295] Set tserver state for 3d78cc34680848ddacf9620033efe712 to MAINTENANCE_MODE
I20251212 21:10:59.078685 26758 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:10:59.183897 25114 ts_manager.cc:295] Set tserver state for dd9e48fb810447718c09aca5a01b0fe3 to MAINTENANCE_MODE
I20251212 21:10:59.211014 25114 ts_manager.cc:295] Set tserver state for c941504e89314a6a868d59585d254b81 to MAINTENANCE_MODE
I20251212 21:10:59.237121 25114 ts_manager.cc:295] Set tserver state for 35867ec45b8041d48fc8c7bb132375c5 to MAINTENANCE_MODE
I20251212 21:10:59.378372 26961 tablet_service.cc:1460] Tablet server 3d78cc34680848ddacf9620033efe712 set to quiescing
I20251212 21:10:59.378453 26961 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:10:59.452339 26810 tablet_service.cc:1460] Tablet server 35867ec45b8041d48fc8c7bb132375c5 set to quiescing
I20251212 21:10:59.452425 26810 tablet_service.cc:1467] Tablet server has 0 leaders and 3 scanners
I20251212 21:10:59.598799 26692 tablet_service.cc:1460] Tablet server dd9e48fb810447718c09aca5a01b0fe3 set to quiescing
I20251212 21:10:59.598876 26692 tablet_service.cc:1467] Tablet server has 1 leaders and 0 scanners
I20251212 21:10:59.600041 27099 raft_consensus.cc:993] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: : Instructing follower c941504e89314a6a868d59585d254b81 to start an election
I20251212 21:10:59.600114 27099 raft_consensus.cc:1081] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 7 LEADER]: Signalling peer c941504e89314a6a868d59585d254b81 to start an election
I20251212 21:10:59.600769 27121 tablet_service.cc:2038] Received Run Leader Election RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122"
dest_uuid: "c941504e89314a6a868d59585d254b81"
 from {username='slave'} at 127.23.110.130:35531
I20251212 21:10:59.600883 27121 raft_consensus.cc:493] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 7 FOLLOWER]: Starting forced leader election (received explicit request)
I20251212 21:10:59.600942 27121 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 7 FOLLOWER]: Advancing to term 8
I20251212 21:10:59.601872 27121 raft_consensus.cc:515] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 8 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:59.602150 27121 leader_election.cc:290] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [CANDIDATE]: Term 8 election: Requested vote from peers 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339), dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130:36255)
I20251212 21:10:59.602258 27120 raft_consensus.cc:1240] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 8 FOLLOWER]: Rejecting Update request from peer dd9e48fb810447718c09aca5a01b0fe3 for earlier term 7. Current term is 8. Ops: []
I20251212 21:10:59.602928 26900 consensus_queue.cc:1059] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 }, Status: INVALID_TERM, Last received: 7.7224, Next index: 7225, Last known committed idx: 7222, Time since last communication: 0.000s
I20251212 21:10:59.603065 26900 raft_consensus.cc:3055] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 7 LEADER]: Stepping down as leader of term 7
I20251212 21:10:59.603094 26900 raft_consensus.cc:740] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 7 LEADER]: Becoming Follower/Learner. State: Replica: dd9e48fb810447718c09aca5a01b0fe3, State: Running, Role: LEADER
I20251212 21:10:59.603137 26900 consensus_queue.cc:260] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 7225, Committed index: 7225, Last appended: 7.7226, Last appended by leader: 7226, Current term: 7, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:10:59.603227 26900 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 7 FOLLOWER]: Advancing to term 8
W20251212 21:10:59.603262 26670 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.604046 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.606130 26807 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
I20251212 21:10:59.607110 26712 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "c941504e89314a6a868d59585d254b81" candidate_term: 8 candidate_status { last_received { term: 7 index: 7224 } } ignore_live_leader: true dest_uuid: "dd9e48fb810447718c09aca5a01b0fe3"
I20251212 21:10:59.607235 26712 raft_consensus.cc:2410] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 8 FOLLOWER]: Leader election vote request: Denying vote to candidate c941504e89314a6a868d59585d254b81 for term 8 because replica has last-logged OpId of term: 7 index: 7226, which is greater than that of the candidate, which has last-logged OpId of term: 7 index: 7224.
W20251212 21:10:59.607542 26808 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
I20251212 21:10:59.608072 26838 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "c941504e89314a6a868d59585d254b81" candidate_term: 8 candidate_status { last_received { term: 7 index: 7224 } } ignore_live_leader: true dest_uuid: "35867ec45b8041d48fc8c7bb132375c5"
I20251212 21:10:59.608141 26838 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 7 FOLLOWER]: Advancing to term 8
I20251212 21:10:59.608774 26838 raft_consensus.cc:2410] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 8 FOLLOWER]: Leader election vote request: Denying vote to candidate c941504e89314a6a868d59585d254b81 for term 8 because replica has last-logged OpId of term: 7 index: 7226, which is greater than that of the candidate, which has last-logged OpId of term: 7 index: 7224.
I20251212 21:10:59.609104 27049 leader_election.cc:304] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [CANDIDATE]: Term 8 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: c941504e89314a6a868d59585d254b81; no voters: 35867ec45b8041d48fc8c7bb132375c5, dd9e48fb810447718c09aca5a01b0fe3
I20251212 21:10:59.609462 27414 raft_consensus.cc:2749] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 8 FOLLOWER]: Leader election lost for term 8. Reason: could not achieve majority
I20251212 21:10:59.610186 27108 tablet_service.cc:1460] Tablet server c941504e89314a6a868d59585d254b81 set to quiescing
I20251212 21:10:59.610234 27108 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251212 21:10:59.611862 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.615873 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.617408 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.620311 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.623070 26790 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.627979 26806 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.633071 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.637662 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.643749 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.644551 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.654405 26808 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.656082 26807 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
I20251212 21:10:59.663756 27028 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
W20251212 21:10:59.666471 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.668612 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.678084 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.678555 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.688578 26806 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.691099 26806 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.701588 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.702672 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.715620 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.718160 26806 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.733415 26806 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.736095 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.751325 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.751322 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.767086 26806 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.768464 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.785184 26806 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.788131 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.807315 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.809568 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.826889 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.830531 26806 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.850497 26806 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.855424 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.874008 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.881430 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.900285 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.905298 27416 raft_consensus.cc:670] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: failed to trigger leader election: Illegal state: leader elections are disabled
W20251212 21:10:59.905298 26806 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.924376 26806 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.932303 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.951509 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.952030 26900 raft_consensus.cc:670] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: failed to trigger leader election: Illegal state: leader elections are disabled
W20251212 21:10:59.958886 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.979035 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.983824 26806 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:10:59.995770 27414 raft_consensus.cc:670] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: failed to trigger leader election: Illegal state: leader elections are disabled
W20251212 21:11:00.009075 26806 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.009696 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.037034 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.037758 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.064857 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.067044 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.093768 26806 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.097859 26806 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.126163 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.130932 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.159098 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.165376 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.192157 26806 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.198187 26806 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.228550 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.234262 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.266947 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.270038 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.302416 26806 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.309623 26806 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.339094 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.349722 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.376963 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.390405 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.417143 26806 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.430190 26806 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.459805 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.469524 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.500595 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.511065 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.541743 26806 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.551971 26806 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.585420 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.597906 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
I20251212 21:11:00.601742 26810 tablet_service.cc:1460] Tablet server 35867ec45b8041d48fc8c7bb132375c5 set to quiescing
I20251212 21:11:00.601807 26810 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251212 21:11:00.629228 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.644680 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.675495 26806 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.691650 26806 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.723207 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.736862 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
I20251212 21:11:00.746263 26692 tablet_service.cc:1460] Tablet server dd9e48fb810447718c09aca5a01b0fe3 set to quiescing
I20251212 21:11:00.746328 26692 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251212 21:11:00.773129 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.787513 26672 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:33392: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
I20251212 21:11:00.801697 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 26629
W20251212 21:11:00.808430 25716 meta_cache.cc:302] tablet edaf6af028f1466d9dccb7d78cf88122: replica dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130:36255) has failed: Network error: recv got EOF from 127.23.110.130:36255 (error 108)
I20251212 21:11:00.808786 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.130:36255
--local_ip_for_outbound_sockets=127.23.110.130
--tserver_master_addrs=127.23.110.190:45865
--webserver_port=40621
--webserver_interface=127.23.110.130
--builtin_ntp_servers=127.23.110.148:39679
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251212 21:11:00.823282 26806 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.836369 26806 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:45880: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.884588 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.892141 27439 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:11:00.892338 27439 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:11:00.892364 27439 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:11:00.893952 27439 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:11:00.894016 27439 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.130
I20251212 21:11:00.895612 27439 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:39679
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.130:36255
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.23.110.130
--webserver_port=40621
--tserver_master_addrs=127.23.110.190:45865
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.27439
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.130
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:11:00.895830 27439 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:11:00.896034 27439 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:11:00.898947 27448 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:11:00.899021 27445 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:11:00.898954 27439 server_base.cc:1047] running on GCE node
W20251212 21:11:00.898954 27446 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:11:00.899325 27439 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:11:00.899538 27439 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:11:00.900692 27439 hybrid_clock.cc:648] HybridClock initialized: now 1765573860900677 us; error 27 us; skew 500 ppm
I20251212 21:11:00.901979 27439 webserver.cc:492] Webserver started at http://127.23.110.130:40621/ using document root <none> and password file <none>
I20251212 21:11:00.902189 27439 fs_manager.cc:362] Metadata directory not provided
I20251212 21:11:00.902235 27439 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:11:00.903414 27439 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.000s	sys 0.001s
I20251212 21:11:00.904124 27454 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:11:00.904294 27439 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.000s	sys 0.001s
I20251212 21:11:00.904366 27439 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
uuid: "dd9e48fb810447718c09aca5a01b0fe3"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:11:00.904662 27439 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251212 21:11:00.916934 27439 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:11:00.917212 27439 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:11:00.917344 27439 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:11:00.917551 27439 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:11:00.918005 27461 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251212 21:11:00.918962 27439 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251212 21:11:00.919015 27439 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:11:00.919052 27439 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251212 21:11:00.919615 27439 ts_tablet_manager.cc:616] Registered 1 tablets
I20251212 21:11:00.919646 27439 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:11:00.919716 27461 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Bootstrap starting.
I20251212 21:11:00.925477 27439 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.130:36255
I20251212 21:11:00.925539 27568 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.130:36255 every 8 connection(s)
I20251212 21:11:00.925864 27439 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data/info.pb
W20251212 21:11:00.926685 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
I20251212 21:11:00.931253 27569 heartbeater.cc:344] Connected to a master server at 127.23.110.190:45865
I20251212 21:11:00.931375 27569 heartbeater.cc:461] Registering TS with master...
I20251212 21:11:00.931594 27569 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:00.932148 25114 ts_manager.cc:194] Re-registered known tserver with Master: dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130:36255)
I20251212 21:11:00.932673 25114 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.130:39305
I20251212 21:11:00.933485 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 27439
I20251212 21:11:00.933580 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 26762
I20251212 21:11:00.944779 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.129:42339
--local_ip_for_outbound_sockets=127.23.110.129
--tserver_master_addrs=127.23.110.190:45865
--webserver_port=34247
--webserver_interface=127.23.110.129
--builtin_ntp_servers=127.23.110.148:39679
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251212 21:11:00.957024 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:00.980787 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
I20251212 21:11:00.992947 27461 log.cc:826] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Log is configured to *not* fsync() on all Append() calls
W20251212 21:11:01.001802 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
I20251212 21:11:01.034289 25743 meta_cache.cc:1510] marking tablet server 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339) as failed
W20251212 21:11:01.043808 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:01.044761 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:01.064117 27574 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:11:01.064383 27574 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:11:01.064422 27574 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:11:01.066906 27574 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:11:01.067010 27574 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.129
I20251212 21:11:01.069335 25745 meta_cache.cc:1510] marking tablet server 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339) as failed
I20251212 21:11:01.069757 27574 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:39679
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.129:42339
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.23.110.129
--webserver_port=34247
--tserver_master_addrs=127.23.110.190:45865
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.27574
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.129
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:11:01.070039 27574 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:11:01.070326 27574 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:11:01.073586 27580 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:11:01.073586 27581 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:11:01.073746 27574 server_base.cc:1047] running on GCE node
W20251212 21:11:01.073745 27583 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:11:01.074066 27574 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:11:01.074278 27574 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:11:01.075431 27574 hybrid_clock.cc:648] HybridClock initialized: now 1765573861075413 us; error 32 us; skew 500 ppm
I20251212 21:11:01.076727 27574 webserver.cc:492] Webserver started at http://127.23.110.129:34247/ using document root <none> and password file <none>
I20251212 21:11:01.076929 27574 fs_manager.cc:362] Metadata directory not provided
I20251212 21:11:01.076972 27574 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:11:01.078305 27574 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
W20251212 21:11:01.078451 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
I20251212 21:11:01.079105 27589 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:11:01.079275 27574 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20251212 21:11:01.079355 27574 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
uuid: "35867ec45b8041d48fc8c7bb132375c5"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:11:01.079725 27574 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20251212 21:11:01.082638 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
I20251212 21:11:01.088626 27574 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:11:01.088884 27574 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:11:01.088981 27574 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:11:01.089200 27574 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:11:01.089702 27596 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251212 21:11:01.090866 27574 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251212 21:11:01.090930 27574 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:11:01.090971 27574 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251212 21:11:01.091579 27574 ts_tablet_manager.cc:616] Registered 1 tablets
I20251212 21:11:01.091621 27574 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:11:01.091697 27596 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Bootstrap starting.
I20251212 21:11:01.097404 27574 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.129:42339
I20251212 21:11:01.097764 27574 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data/info.pb
I20251212 21:11:01.100919 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 27574
I20251212 21:11:01.101023 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 26898
I20251212 21:11:01.107148 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.132:41841
--local_ip_for_outbound_sockets=127.23.110.132
--tserver_master_addrs=127.23.110.190:45865
--webserver_port=34523
--webserver_interface=127.23.110.132
--builtin_ntp_servers=127.23.110.148:39679
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251212 21:11:01.109642 27703 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.129:42339 every 8 connection(s)
W20251212 21:11:01.115129 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
I20251212 21:11:01.118654 27704 heartbeater.cc:344] Connected to a master server at 127.23.110.190:45865
I20251212 21:11:01.118791 27704 heartbeater.cc:461] Registering TS with master...
I20251212 21:11:01.119011 27704 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:01.119594 25114 ts_manager.cc:194] Re-registered known tserver with Master: 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339)
I20251212 21:11:01.120134 25114 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.129:51913
I20251212 21:11:01.160171 27596 log.cc:826] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Log is configured to *not* fsync() on all Append() calls
W20251212 21:11:01.177695 27083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58272: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:01.196565 27706 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:11:01.196769 27706 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:11:01.196800 27706 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:11:01.198756 27706 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:11:01.198860 27706 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.132
I20251212 21:11:01.201128 27706 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:39679
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.132:41841
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.23.110.132
--webserver_port=34523
--tserver_master_addrs=127.23.110.190:45865
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.27706
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.132
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:11:01.201665 27706 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:11:01.202029 27706 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:11:01.204627 27714 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:11:01.204697 27715 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:11:01.204897 27717 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:11:01.205451 27706 server_base.cc:1047] running on GCE node
I20251212 21:11:01.205677 27706 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:11:01.205984 27706 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:11:01.207227 27706 hybrid_clock.cc:648] HybridClock initialized: now 1765573861207207 us; error 32 us; skew 500 ppm
I20251212 21:11:01.208642 27706 webserver.cc:492] Webserver started at http://127.23.110.132:34523/ using document root <none> and password file <none>
I20251212 21:11:01.208866 27706 fs_manager.cc:362] Metadata directory not provided
I20251212 21:11:01.208914 27706 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:11:01.210350 27706 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:11:01.210983 27723 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:11:01.211145 27706 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20251212 21:11:01.211218 27706 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
uuid: "3d78cc34680848ddacf9620033efe712"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:11:01.211503 27706 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251212 21:11:01.230516 27706 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:11:01.230782 27706 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:11:01.230890 27706 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:11:01.231103 27706 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:11:01.231423 27706 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251212 21:11:01.231458 27706 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:11:01.231489 27706 ts_tablet_manager.cc:616] Registered 0 tablets
I20251212 21:11:01.231552 27706 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:11:01.238157 27706 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.132:41841
I20251212 21:11:01.238225 27836 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.132:41841 every 8 connection(s)
I20251212 21:11:01.238580 27706 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data/info.pb
I20251212 21:11:01.242569 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 27706
I20251212 21:11:01.242682 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 27031
W20251212 21:11:01.249787 25717 connection.cc:537] client connection to 127.23.110.131:33221 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
I20251212 21:11:01.250355 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.131:33221
--local_ip_for_outbound_sockets=127.23.110.131
--tserver_master_addrs=127.23.110.190:45865
--webserver_port=43311
--webserver_interface=127.23.110.131
--builtin_ntp_servers=127.23.110.148:39679
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251212 21:11:01.251036 27837 heartbeater.cc:344] Connected to a master server at 127.23.110.190:45865
I20251212 21:11:01.251137 27837 heartbeater.cc:461] Registering TS with master...
I20251212 21:11:01.251367 27837 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:01.251930 25114 ts_manager.cc:194] Re-registered known tserver with Master: 3d78cc34680848ddacf9620033efe712 (127.23.110.132:41841)
I20251212 21:11:01.252315 25114 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.132:51779
W20251212 21:11:01.334383 27840 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:11:01.334574 27840 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:11:01.334604 27840 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:11:01.336268 27840 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:11:01.336344 27840 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.131
I20251212 21:11:01.338140 27840 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:39679
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.131:33221
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.23.110.131
--webserver_port=43311
--tserver_master_addrs=127.23.110.190:45865
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.27840
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.131
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:11:01.338423 27840 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:11:01.338662 27840 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:11:01.341267 27846 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:11:01.341475 27840 server_base.cc:1047] running on GCE node
W20251212 21:11:01.341269 27848 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:11:01.345342 27845 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:11:01.345644 27840 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:11:01.345943 27840 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:11:01.347091 27840 hybrid_clock.cc:648] HybridClock initialized: now 1765573861347087 us; error 36 us; skew 500 ppm
I20251212 21:11:01.348345 27840 webserver.cc:492] Webserver started at http://127.23.110.131:43311/ using document root <none> and password file <none>
I20251212 21:11:01.348582 27840 fs_manager.cc:362] Metadata directory not provided
I20251212 21:11:01.348637 27840 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:11:01.349851 27840 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:11:01.350492 27854 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:11:01.350656 27840 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20251212 21:11:01.350730 27840 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
uuid: "c941504e89314a6a868d59585d254b81"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:11:01.350998 27840 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251212 21:11:01.365113 27840 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:11:01.365415 27840 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:11:01.365525 27840 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:11:01.365754 27840 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:11:01.366221 27861 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251212 21:11:01.367074 27840 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251212 21:11:01.367122 27840 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:11:01.367175 27840 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251212 21:11:01.367710 27840 ts_tablet_manager.cc:616] Registered 1 tablets
I20251212 21:11:01.367805 27840 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:11:01.367816 27861 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Bootstrap starting.
I20251212 21:11:01.375438 27840 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.131:33221
I20251212 21:11:01.375950 27840 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data/info.pb
I20251212 21:11:01.376379 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 27840
I20251212 21:11:01.385596 27968 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.131:33221 every 8 connection(s)
I20251212 21:11:01.407615 27970 heartbeater.cc:344] Connected to a master server at 127.23.110.190:45865
I20251212 21:11:01.407743 27970 heartbeater.cc:461] Registering TS with master...
I20251212 21:11:01.407966 27970 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:01.408596 25114 ts_manager.cc:194] Re-registered known tserver with Master: c941504e89314a6a868d59585d254b81 (127.23.110.131:33221)
I20251212 21:11:01.409142 25114 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.131:49767
I20251212 21:11:01.436712 27861 log.cc:826] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Log is configured to *not* fsync() on all Append() calls
I20251212 21:11:01.575166 27619 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:01.583645 27903 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:01.585188 27771 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:01.608115 27503 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:01.933701 27569 heartbeater.cc:499] Master 127.23.110.190:45865 was elected leader, sending a full tablet report...
I20251212 21:11:02.002705 27461 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Bootstrap replayed 1/2 log segments. Stats: ops{read=4625 overwritten=0 applied=4623 ignored=0} inserts{seen=230950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251212 21:11:02.121089 27704 heartbeater.cc:499] Master 127.23.110.190:45865 was elected leader, sending a full tablet report...
I20251212 21:11:02.253324 27837 heartbeater.cc:499] Master 127.23.110.190:45865 was elected leader, sending a full tablet report...
I20251212 21:11:02.333184 27596 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Bootstrap replayed 1/2 log segments. Stats: ops{read=4840 overwritten=0 applied=4837 ignored=0} inserts{seen=241650 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251212 21:11:02.410072 27970 heartbeater.cc:499] Master 127.23.110.190:45865 was elected leader, sending a full tablet report...
I20251212 21:11:02.686213 27461 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Bootstrap replayed 2/2 log segments. Stats: ops{read=7226 overwritten=0 applied=7225 ignored=0} inserts{seen=361050 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20251212 21:11:02.686705 27461 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Bootstrap complete.
I20251212 21:11:02.690493 27461 ts_tablet_manager.cc:1403] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Time spent bootstrapping tablet: real 1.771s	user 1.482s	sys 0.280s
I20251212 21:11:02.692126 27461 raft_consensus.cc:359] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 8 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:02.692879 27461 raft_consensus.cc:740] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 8 FOLLOWER]: Becoming Follower/Learner. State: Replica: dd9e48fb810447718c09aca5a01b0fe3, State: Initialized, Role: FOLLOWER
I20251212 21:11:02.693086 27461 consensus_queue.cc:260] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 7225, Last appended: 7.7226, Last appended by leader: 7226, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:02.693349 27461 ts_tablet_manager.cc:1434] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Time spent starting tablet: real 0.003s	user 0.003s	sys 0.001s
I20251212 21:11:02.715754 27861 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Bootstrap replayed 1/2 log segments. Stats: ops{read=4844 overwritten=0 applied=4840 ignored=0} inserts{seen=241800 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 4 replicates
W20251212 21:11:02.761277 27483 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55376: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
I20251212 21:11:02.767738 27596 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Bootstrap replayed 2/2 log segments. Stats: ops{read=7226 overwritten=0 applied=7225 ignored=0} inserts{seen=361050 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20251212 21:11:02.768187 27596 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Bootstrap complete.
I20251212 21:11:02.770891 27596 ts_tablet_manager.cc:1403] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Time spent bootstrapping tablet: real 1.679s	user 1.415s	sys 0.231s
I20251212 21:11:02.771981 27596 raft_consensus.cc:359] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 8 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:02.772584 27596 raft_consensus.cc:740] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 8 FOLLOWER]: Becoming Follower/Learner. State: Replica: 35867ec45b8041d48fc8c7bb132375c5, State: Initialized, Role: FOLLOWER
I20251212 21:11:02.772727 27596 consensus_queue.cc:260] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 7225, Last appended: 7.7226, Last appended by leader: 7226, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:02.772943 27596 ts_tablet_manager.cc:1434] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Time spent starting tablet: real 0.002s	user 0.003s	sys 0.001s
W20251212 21:11:02.839540 27618 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35008: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:02.848244 27483 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55376: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:03.000380 27483 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:55376: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:03.012148 27618 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35008: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
I20251212 21:11:03.029994 28009 raft_consensus.cc:493] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 8 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251212 21:11:03.030114 28009 raft_consensus.cc:515] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 8 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:03.030426 28009 leader_election.cc:290] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 9 pre-election: Requested pre-vote from peers 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339), c941504e89314a6a868d59585d254b81 (127.23.110.131:33221)
I20251212 21:11:03.034478 27658 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "dd9e48fb810447718c09aca5a01b0fe3" candidate_term: 9 candidate_status { last_received { term: 7 index: 7226 } } ignore_live_leader: false dest_uuid: "35867ec45b8041d48fc8c7bb132375c5" is_pre_election: true
I20251212 21:11:03.033969 27923 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "dd9e48fb810447718c09aca5a01b0fe3" candidate_term: 9 candidate_status { last_received { term: 7 index: 7226 } } ignore_live_leader: false dest_uuid: "c941504e89314a6a868d59585d254b81" is_pre_election: true
I20251212 21:11:03.034659 27658 raft_consensus.cc:2468] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 8 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate dd9e48fb810447718c09aca5a01b0fe3 in term 8.
W20251212 21:11:03.034893 27458 leader_election.cc:343] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 9 pre-election: Tablet error from VoteRequest() call to peer c941504e89314a6a868d59585d254b81 (127.23.110.131:33221): Illegal state: must be running to vote when last-logged opid is not known
I20251212 21:11:03.035041 27456 leader_election.cc:304] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 9 pre-election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 35867ec45b8041d48fc8c7bb132375c5, dd9e48fb810447718c09aca5a01b0fe3; no voters: c941504e89314a6a868d59585d254b81
I20251212 21:11:03.035199 28009 raft_consensus.cc:2804] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 8 FOLLOWER]: Leader pre-election won for term 9
I20251212 21:11:03.035251 28009 raft_consensus.cc:493] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 8 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251212 21:11:03.035280 28009 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 8 FOLLOWER]: Advancing to term 9
I20251212 21:11:03.036207 28009 raft_consensus.cc:515] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 9 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:03.036345 28009 leader_election.cc:290] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 9 election: Requested vote from peers 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339), c941504e89314a6a868d59585d254b81 (127.23.110.131:33221)
I20251212 21:11:03.036576 27923 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "dd9e48fb810447718c09aca5a01b0fe3" candidate_term: 9 candidate_status { last_received { term: 7 index: 7226 } } ignore_live_leader: false dest_uuid: "c941504e89314a6a868d59585d254b81"
I20251212 21:11:03.036561 27658 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "dd9e48fb810447718c09aca5a01b0fe3" candidate_term: 9 candidate_status { last_received { term: 7 index: 7226 } } ignore_live_leader: false dest_uuid: "35867ec45b8041d48fc8c7bb132375c5"
I20251212 21:11:03.036649 27658 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 8 FOLLOWER]: Advancing to term 9
W20251212 21:11:03.036769 27458 leader_election.cc:343] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 9 election: Tablet error from VoteRequest() call to peer c941504e89314a6a868d59585d254b81 (127.23.110.131:33221): Illegal state: must be running to vote when last-logged opid is not known
I20251212 21:11:03.037599 27658 raft_consensus.cc:2468] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 9 FOLLOWER]: Leader election vote request: Granting yes vote for candidate dd9e48fb810447718c09aca5a01b0fe3 in term 9.
I20251212 21:11:03.037767 27456 leader_election.cc:304] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 9 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 35867ec45b8041d48fc8c7bb132375c5, dd9e48fb810447718c09aca5a01b0fe3; no voters: c941504e89314a6a868d59585d254b81
I20251212 21:11:03.037885 28009 raft_consensus.cc:2804] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 9 FOLLOWER]: Leader election won for term 9
I20251212 21:11:03.038043 28009 raft_consensus.cc:697] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 9 LEADER]: Becoming Leader. State: Replica: dd9e48fb810447718c09aca5a01b0fe3, State: Running, Role: LEADER
I20251212 21:11:03.038177 28009 consensus_queue.cc:237] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 7225, Committed index: 7225, Last appended: 7.7226, Last appended by leader: 7226, Current term: 9, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:03.038829 25114 catalog_manager.cc:5654] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 reported cstate change: term changed from 7 to 9. New cstate: current_term: 9 leader_uuid: "dd9e48fb810447718c09aca5a01b0fe3" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } health_report { overall_health: HEALTHY } } }
W20251212 21:11:03.081385 27618 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35008: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:03.094746 27458 consensus_peers.cc:597] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 -> Peer c941504e89314a6a868d59585d254b81 (127.23.110.131:33221): Couldn't send request to peer c941504e89314a6a868d59585d254b81. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 1: this message will repeat every 5th retry.
I20251212 21:11:03.094766 27658 raft_consensus.cc:1275] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 9 FOLLOWER]: Refusing update from remote peer dd9e48fb810447718c09aca5a01b0fe3: Log matching property violated. Preceding OpId in replica: term: 7 index: 7226. Preceding OpId from leader: term: 9 index: 7228. (index mismatch)
I20251212 21:11:03.095048 28009 consensus_queue.cc:1048] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [LEADER]: Connected to new peer: Peer: permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 7227, Last known committed idx: 7225, Time since last communication: 0.000s
I20251212 21:11:03.096587 28008 mvcc.cc:204] Tried to move back new op lower bound from 7231790543233609728 to 7231790543004868608. Current Snapshot: MvccSnapshot[applied={T|T < 7231790528933142528}]
I20251212 21:11:03.097028 28010 mvcc.cc:204] Tried to move back new op lower bound from 7231790543233609728 to 7231790543004868608. Current Snapshot: MvccSnapshot[applied={T|T < 7231790528933142528}]
I20251212 21:11:03.148056 27861 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Bootstrap replayed 2/2 log segments. Stats: ops{read=7224 overwritten=0 applied=7222 ignored=0} inserts{seen=360900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251212 21:11:03.148622 27861 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Bootstrap complete.
I20251212 21:11:03.152083 27861 ts_tablet_manager.cc:1403] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Time spent bootstrapping tablet: real 1.784s	user 1.532s	sys 0.228s
I20251212 21:11:03.153328 27861 raft_consensus.cc:359] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 8 FOLLOWER]: Replica starting. Triggering 2 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:03.154238 27861 raft_consensus.cc:740] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 8 FOLLOWER]: Becoming Follower/Learner. State: Replica: c941504e89314a6a868d59585d254b81, State: Initialized, Role: FOLLOWER
I20251212 21:11:03.154429 27861 consensus_queue.cc:260] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 7222, Last appended: 7.7224, Last appended by leader: 7224, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:03.154722 27861 ts_tablet_manager.cc:1434] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Time spent starting tablet: real 0.002s	user 0.000s	sys 0.002s
W20251212 21:11:03.165467 27876 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58308: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:03.166060 27876 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58308: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:03.166581 27876 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58308: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:03.169267 27618 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35008: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:03.172333 27618 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35008: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
I20251212 21:11:03.201890 27923 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 8 FOLLOWER]: Advancing to term 9
I20251212 21:11:03.203295 27923 raft_consensus.cc:1275] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 9 FOLLOWER]: Refusing update from remote peer dd9e48fb810447718c09aca5a01b0fe3: Log matching property violated. Preceding OpId in replica: term: 7 index: 7224. Preceding OpId from leader: term: 7 index: 7226. (index mismatch)
I20251212 21:11:03.203774 28009 consensus_queue.cc:1050] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [LEADER]: Got LMP mismatch error from peer: Peer: permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 7227, Last known committed idx: 7222, Time since last communication: 0.000s
I20251212 21:11:03.215224 28029 mvcc.cc:204] Tried to move back new op lower bound from 7231790543678033920 to 7231790543004868608. Current Snapshot: MvccSnapshot[applied={T|T < 7231790543275212800 or (T in {7231790543281520640,7231790543283699712,7231790543292841984,7231790543297933312,7231790543305904128,7231790543312801792,7231790543316344832,7231790543320739840,7231790543326281728,7231790543329771520,7231790543333584896,7231790543337902080,7231790543342862336,7231790543289012224,7231790543348727808,7231790543352000512,7231790543307030528,7231790543357558784,7231790543366549504,7231790543363342336,7231790543375872000,7231790543371874304,7231790543381254144,7231790543385067520,7231790543391162368,7231790543395536896,7231790543399391232,7231790543403872256})}]
W20251212 21:11:03.688119 25744 scanner-internal.cc:458] Time spent opening tablet: real 3.807s	user 0.001s	sys 0.000s
W20251212 21:11:03.836768 25743 scanner-internal.cc:458] Time spent opening tablet: real 4.006s	user 0.001s	sys 0.001s
W20251212 21:11:03.871887 25745 scanner-internal.cc:458] Time spent opening tablet: real 4.007s	user 0.001s	sys 0.001s
I20251212 21:11:06.849471 27619 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:06.849923 27771 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:06.855913 27903 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:06.862630 27503 tablet_service.cc:1467] Tablet server has 1 leaders and 2 scanners
I20251212 21:11:07.185505 25114 ts_manager.cc:284] Unset tserver state for dd9e48fb810447718c09aca5a01b0fe3 from MAINTENANCE_MODE
I20251212 21:11:07.211583 27970 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:07.213913 25116 ts_manager.cc:284] Unset tserver state for c941504e89314a6a868d59585d254b81 from MAINTENANCE_MODE
I20251212 21:11:07.221091 25114 ts_manager.cc:284] Unset tserver state for 3d78cc34680848ddacf9620033efe712 from MAINTENANCE_MODE
I20251212 21:11:07.225660 27569 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:07.234444 25114 ts_manager.cc:284] Unset tserver state for 35867ec45b8041d48fc8c7bb132375c5 from MAINTENANCE_MODE
I20251212 21:11:07.257473 27837 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:07.558418 25114 ts_manager.cc:295] Set tserver state for c941504e89314a6a868d59585d254b81 to MAINTENANCE_MODE
I20251212 21:11:07.799566 27903 tablet_service.cc:1460] Tablet server c941504e89314a6a868d59585d254b81 set to quiescing
I20251212 21:11:07.799628 27903 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:07.815074 25114 ts_manager.cc:295] Set tserver state for 3d78cc34680848ddacf9620033efe712 to MAINTENANCE_MODE
I20251212 21:11:07.831920 25114 ts_manager.cc:295] Set tserver state for dd9e48fb810447718c09aca5a01b0fe3 to MAINTENANCE_MODE
I20251212 21:11:07.892068 25114 ts_manager.cc:295] Set tserver state for 35867ec45b8041d48fc8c7bb132375c5 to MAINTENANCE_MODE
I20251212 21:11:08.073282 27503 tablet_service.cc:1460] Tablet server dd9e48fb810447718c09aca5a01b0fe3 set to quiescing
I20251212 21:11:08.073351 27503 tablet_service.cc:1467] Tablet server has 1 leaders and 3 scanners
I20251212 21:11:08.073917 28193 raft_consensus.cc:993] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: : Instructing follower 35867ec45b8041d48fc8c7bb132375c5 to start an election
I20251212 21:11:08.073974 28193 raft_consensus.cc:1081] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 9 LEADER]: Signalling peer 35867ec45b8041d48fc8c7bb132375c5 to start an election
I20251212 21:11:08.074254 27657 tablet_service.cc:2038] Received Run Leader Election RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122"
dest_uuid: "35867ec45b8041d48fc8c7bb132375c5"
 from {username='slave'} at 127.23.110.130:51065
I20251212 21:11:08.074335 27657 raft_consensus.cc:493] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 9 FOLLOWER]: Starting forced leader election (received explicit request)
I20251212 21:11:08.074360 27657 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 9 FOLLOWER]: Advancing to term 10
I20251212 21:11:08.075186 27657 raft_consensus.cc:515] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 10 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:08.075448 27657 leader_election.cc:290] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [CANDIDATE]: Term 10 election: Requested vote from peers c941504e89314a6a868d59585d254b81 (127.23.110.131:33221), dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130:36255)
I20251212 21:11:08.077483 27657 raft_consensus.cc:1240] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 10 FOLLOWER]: Rejecting Update request from peer dd9e48fb810447718c09aca5a01b0fe3 for earlier term 9. Current term is 10. Ops: [9.11776-9.11777]
I20251212 21:11:08.078054 28193 consensus_queue.cc:1059] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 }, Status: INVALID_TERM, Last received: 9.11775, Next index: 11776, Last known committed idx: 11775, Time since last communication: 0.000s
I20251212 21:11:08.078192 28016 raft_consensus.cc:3055] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 9 LEADER]: Stepping down as leader of term 9
I20251212 21:11:08.078248 28016 raft_consensus.cc:740] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 9 LEADER]: Becoming Follower/Learner. State: Replica: dd9e48fb810447718c09aca5a01b0fe3, State: Running, Role: LEADER
I20251212 21:11:08.078287 28016 consensus_queue.cc:260] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 11775, Committed index: 11775, Last appended: 9.11778, Last appended by leader: 11778, Current term: 9, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:08.078378 28016 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 9 FOLLOWER]: Advancing to term 10
I20251212 21:11:08.083300 27922 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "35867ec45b8041d48fc8c7bb132375c5" candidate_term: 10 candidate_status { last_received { term: 9 index: 11775 } } ignore_live_leader: true dest_uuid: "c941504e89314a6a868d59585d254b81"
I20251212 21:11:08.083403 27922 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 9 FOLLOWER]: Advancing to term 10
I20251212 21:11:08.084206 27922 raft_consensus.cc:2410] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 10 FOLLOWER]: Leader election vote request: Denying vote to candidate 35867ec45b8041d48fc8c7bb132375c5 for term 10 because replica has last-logged OpId of term: 9 index: 11778, which is greater than that of the candidate, which has last-logged OpId of term: 9 index: 11775.
I20251212 21:11:08.084693 27523 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "35867ec45b8041d48fc8c7bb132375c5" candidate_term: 10 candidate_status { last_received { term: 9 index: 11775 } } ignore_live_leader: true dest_uuid: "dd9e48fb810447718c09aca5a01b0fe3"
I20251212 21:11:08.084828 27523 raft_consensus.cc:2410] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 10 FOLLOWER]: Leader election vote request: Denying vote to candidate 35867ec45b8041d48fc8c7bb132375c5 for term 10 because replica has last-logged OpId of term: 9 index: 11778, which is greater than that of the candidate, which has last-logged OpId of term: 9 index: 11775.
I20251212 21:11:08.085008 27592 leader_election.cc:304] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [CANDIDATE]: Term 10 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 35867ec45b8041d48fc8c7bb132375c5; no voters: c941504e89314a6a868d59585d254b81, dd9e48fb810447718c09aca5a01b0fe3
I20251212 21:11:08.097347 28209 raft_consensus.cc:2749] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 10 FOLLOWER]: Leader election lost for term 10. Reason: could not achieve majority
I20251212 21:11:08.101531 27704 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:08.107254 27771 tablet_service.cc:1460] Tablet server 3d78cc34680848ddacf9620033efe712 set to quiescing
I20251212 21:11:08.107344 27771 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:08.142386 27619 tablet_service.cc:1460] Tablet server 35867ec45b8041d48fc8c7bb132375c5 set to quiescing
I20251212 21:11:08.142465 27619 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:08.214931 27970 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:08.226701 27569 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
W20251212 21:11:08.377331 28016 raft_consensus.cc:670] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: failed to trigger leader election: Illegal state: leader elections are disabled
W20251212 21:11:08.379675 28230 raft_consensus.cc:670] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: failed to trigger leader election: Illegal state: leader elections are disabled
W20251212 21:11:08.514099 28209 raft_consensus.cc:670] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: failed to trigger leader election: Illegal state: leader elections are disabled
I20251212 21:11:09.219826 27503 tablet_service.cc:1460] Tablet server dd9e48fb810447718c09aca5a01b0fe3 set to quiescing
I20251212 21:11:09.219905 27503 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:09.275214 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 27439
W20251212 21:11:09.283811 25716 connection.cc:537] client connection to 127.23.110.130:36255 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20251212 21:11:09.283926 25716 meta_cache.cc:302] tablet edaf6af028f1466d9dccb7d78cf88122: replica dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130:36255) has failed: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
I20251212 21:11:09.283950 25745 meta_cache.cc:1510] marking tablet server dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130:36255) as failed
I20251212 21:11:09.284193 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.130:36255
--local_ip_for_outbound_sockets=127.23.110.130
--tserver_master_addrs=127.23.110.190:45865
--webserver_port=40621
--webserver_interface=127.23.110.130
--builtin_ntp_servers=127.23.110.148:39679
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251212 21:11:09.286326 27618 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:35008: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:09.363179 28242 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:11:09.363366 28242 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:11:09.363395 28242 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:11:09.364863 28242 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:11:09.364933 28242 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.130
I20251212 21:11:09.366547 28242 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:39679
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.130:36255
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.23.110.130
--webserver_port=40621
--tserver_master_addrs=127.23.110.190:45865
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.28242
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.130
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:11:09.366788 28242 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:11:09.367014 28242 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:11:09.369704 28249 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:11:09.369797 28242 server_base.cc:1047] running on GCE node
W20251212 21:11:09.369699 28248 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:11:09.369725 28251 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:11:09.370198 28242 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:11:09.370434 28242 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:11:09.371596 28242 hybrid_clock.cc:648] HybridClock initialized: now 1765573869371558 us; error 51 us; skew 500 ppm
I20251212 21:11:09.372776 28242 webserver.cc:492] Webserver started at http://127.23.110.130:40621/ using document root <none> and password file <none>
I20251212 21:11:09.372972 28242 fs_manager.cc:362] Metadata directory not provided
I20251212 21:11:09.373014 28242 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:11:09.374210 28242 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:11:09.375001 28257 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:11:09.375188 28242 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:11:09.375257 28242 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
uuid: "dd9e48fb810447718c09aca5a01b0fe3"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:11:09.375546 28242 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251212 21:11:09.383632 28242 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:11:09.383896 28242 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:11:09.383996 28242 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:11:09.384195 28242 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:11:09.384609 28264 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251212 21:11:09.385530 28242 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251212 21:11:09.385592 28242 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:11:09.385633 28242 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251212 21:11:09.386194 28242 ts_tablet_manager.cc:616] Registered 1 tablets
I20251212 21:11:09.386229 28242 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:11:09.386307 28264 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Bootstrap starting.
I20251212 21:11:09.393028 28242 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.130:36255
I20251212 21:11:09.393074 28371 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.130:36255 every 8 connection(s)
I20251212 21:11:09.393456 28242 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data/info.pb
I20251212 21:11:09.398614 28372 heartbeater.cc:344] Connected to a master server at 127.23.110.190:45865
I20251212 21:11:09.398721 28372 heartbeater.cc:461] Registering TS with master...
I20251212 21:11:09.398912 28372 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:09.399129 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 28242
I20251212 21:11:09.399225 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 27574
I20251212 21:11:09.400038 25114 ts_manager.cc:194] Re-registered known tserver with Master: dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130:36255)
I20251212 21:11:09.400547 25114 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.130:43675
I20251212 21:11:09.407923 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.129:42339
--local_ip_for_outbound_sockets=127.23.110.129
--tserver_master_addrs=127.23.110.190:45865
--webserver_port=34247
--webserver_interface=127.23.110.129
--builtin_ntp_servers=127.23.110.148:39679
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251212 21:11:09.436767 28264 log.cc:826] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Log is configured to *not* fsync() on all Append() calls
W20251212 21:11:09.490173 28376 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:11:09.490343 28376 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:11:09.490363 28376 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:11:09.492040 28376 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:11:09.492103 28376 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.129
I20251212 21:11:09.493764 28376 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:39679
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.129:42339
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.23.110.129
--webserver_port=34247
--tserver_master_addrs=127.23.110.190:45865
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.28376
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.129
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:11:09.493975 28376 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:11:09.494189 28376 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:11:09.496696 28383 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:11:09.496762 28382 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:11:09.496840 28376 server_base.cc:1047] running on GCE node
W20251212 21:11:09.496696 28385 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:11:09.497113 28376 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:11:09.497331 28376 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:11:09.498466 28376 hybrid_clock.cc:648] HybridClock initialized: now 1765573869498451 us; error 29 us; skew 500 ppm
I20251212 21:11:09.499583 28376 webserver.cc:492] Webserver started at http://127.23.110.129:34247/ using document root <none> and password file <none>
I20251212 21:11:09.499773 28376 fs_manager.cc:362] Metadata directory not provided
I20251212 21:11:09.499816 28376 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:11:09.500988 28376 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:11:09.501708 28391 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:11:09.501869 28376 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:11:09.501936 28376 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
uuid: "35867ec45b8041d48fc8c7bb132375c5"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:11:09.502183 28376 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251212 21:11:09.512586 28376 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:11:09.512826 28376 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:11:09.512926 28376 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:11:09.513121 28376 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:11:09.513607 28398 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251212 21:11:09.514770 28376 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251212 21:11:09.514828 28376 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:11:09.514865 28376 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251212 21:11:09.515376 28376 ts_tablet_manager.cc:616] Registered 1 tablets
I20251212 21:11:09.515406 28376 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:11:09.515560 28398 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Bootstrap starting.
I20251212 21:11:09.522598 28376 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.129:42339
I20251212 21:11:09.522653 28505 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.129:42339 every 8 connection(s)
I20251212 21:11:09.523026 28376 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data/info.pb
I20251212 21:11:09.525590 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 28376
I20251212 21:11:09.525712 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 27706
I20251212 21:11:09.532019 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.132:41841
--local_ip_for_outbound_sockets=127.23.110.132
--tserver_master_addrs=127.23.110.190:45865
--webserver_port=34523
--webserver_interface=127.23.110.132
--builtin_ntp_servers=127.23.110.148:39679
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251212 21:11:09.537308 28506 heartbeater.cc:344] Connected to a master server at 127.23.110.190:45865
I20251212 21:11:09.537534 28506 heartbeater.cc:461] Registering TS with master...
I20251212 21:11:09.537822 28506 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:09.538463 25114 ts_manager.cc:194] Re-registered known tserver with Master: 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339)
I20251212 21:11:09.538995 25114 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.129:56627
I20251212 21:11:09.585431 28398 log.cc:826] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Log is configured to *not* fsync() on all Append() calls
W20251212 21:11:09.618942 28509 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:11:09.619113 28509 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:11:09.619144 28509 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:11:09.620641 28509 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:11:09.620713 28509 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.132
I20251212 21:11:09.622375 28509 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:39679
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.132:41841
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.23.110.132
--webserver_port=34523
--tserver_master_addrs=127.23.110.190:45865
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.28509
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.132
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:11:09.622637 28509 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:11:09.622884 28509 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:11:09.625401 28516 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:11:09.625424 28517 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:11:09.625605 28519 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:11:09.626076 28509 server_base.cc:1047] running on GCE node
I20251212 21:11:09.626240 28509 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:11:09.626436 28509 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:11:09.627638 28509 hybrid_clock.cc:648] HybridClock initialized: now 1765573869627624 us; error 27 us; skew 500 ppm
I20251212 21:11:09.628767 28509 webserver.cc:492] Webserver started at http://127.23.110.132:34523/ using document root <none> and password file <none>
I20251212 21:11:09.628993 28509 fs_manager.cc:362] Metadata directory not provided
I20251212 21:11:09.629037 28509 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:11:09.630244 28509 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:11:09.630810 28525 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:11:09.630985 28509 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20251212 21:11:09.631048 28509 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
uuid: "3d78cc34680848ddacf9620033efe712"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:11:09.631319 28509 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251212 21:11:09.644610 28509 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:11:09.644867 28509 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:11:09.644970 28509 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:11:09.645174 28509 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:11:09.645531 28509 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251212 21:11:09.645568 28509 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:11:09.645598 28509 ts_tablet_manager.cc:616] Registered 0 tablets
I20251212 21:11:09.645648 28509 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:11:09.652143 28509 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.132:41841
I20251212 21:11:09.652184 28638 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.132:41841 every 8 connection(s)
I20251212 21:11:09.652586 28509 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data/info.pb
I20251212 21:11:09.659402 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 28509
I20251212 21:11:09.659509 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 27840
I20251212 21:11:09.664309 28639 heartbeater.cc:344] Connected to a master server at 127.23.110.190:45865
I20251212 21:11:09.664424 28639 heartbeater.cc:461] Registering TS with master...
I20251212 21:11:09.664647 28639 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:09.665112 25114 ts_manager.cc:194] Re-registered known tserver with Master: 3d78cc34680848ddacf9620033efe712 (127.23.110.132:41841)
I20251212 21:11:09.665638 25114 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.132:42565
W20251212 21:11:09.670938 25717 connection.cc:537] client connection to 127.23.110.131:33221 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
I20251212 21:11:09.671198 25743 meta_cache.cc:1510] marking tablet server c941504e89314a6a868d59585d254b81 (127.23.110.131:33221) as failed
I20251212 21:11:09.671341 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.131:33221
--local_ip_for_outbound_sockets=127.23.110.131
--tserver_master_addrs=127.23.110.190:45865
--webserver_port=43311
--webserver_interface=127.23.110.131
--builtin_ntp_servers=127.23.110.148:39679
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251212 21:11:09.688741 25745 meta_cache.cc:1510] marking tablet server c941504e89314a6a868d59585d254b81 (127.23.110.131:33221) as failed
W20251212 21:11:09.789232 28642 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:11:09.789485 28642 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:11:09.789523 28642 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:11:09.791970 28642 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:11:09.792065 28642 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.131
I20251212 21:11:09.794675 28642 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:39679
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.131:33221
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.23.110.131
--webserver_port=43311
--tserver_master_addrs=127.23.110.190:45865
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.28642
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.131
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:11:09.794977 28642 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:11:09.795261 28642 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:11:09.798338 28651 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:11:09.798339 28649 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:11:09.798795 28642 server_base.cc:1047] running on GCE node
W20251212 21:11:09.801380 28648 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:11:09.801682 28642 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:11:09.801957 28642 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:11:09.803114 28642 hybrid_clock.cc:648] HybridClock initialized: now 1765573869803088 us; error 41 us; skew 500 ppm
I20251212 21:11:09.804636 28642 webserver.cc:492] Webserver started at http://127.23.110.131:43311/ using document root <none> and password file <none>
I20251212 21:11:09.804863 28642 fs_manager.cc:362] Metadata directory not provided
I20251212 21:11:09.804925 28642 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:11:09.806667 28642 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:11:09.807502 28657 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:11:09.807663 28642 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:11:09.807744 28642 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
uuid: "c941504e89314a6a868d59585d254b81"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:11:09.808064 28642 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251212 21:11:09.834787 28642 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:11:09.835091 28642 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:11:09.835203 28642 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:11:09.835456 28642 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:11:09.835984 28664 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251212 21:11:09.837339 28642 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251212 21:11:09.837436 28642 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.002s	user 0.000s	sys 0.000s
I20251212 21:11:09.837518 28642 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251212 21:11:09.838277 28642 ts_tablet_manager.cc:616] Registered 1 tablets
I20251212 21:11:09.838366 28642 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:11:09.838392 28664 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Bootstrap starting.
I20251212 21:11:09.845006 28642 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.131:33221
I20251212 21:11:09.845542 28642 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data/info.pb
I20251212 21:11:09.849654 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 28642
I20251212 21:11:09.857949 28771 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.131:33221 every 8 connection(s)
I20251212 21:11:09.876789 28772 heartbeater.cc:344] Connected to a master server at 127.23.110.190:45865
I20251212 21:11:09.876899 28772 heartbeater.cc:461] Registering TS with master...
I20251212 21:11:09.877089 28772 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:09.877578 25116 ts_manager.cc:194] Re-registered known tserver with Master: c941504e89314a6a868d59585d254b81 (127.23.110.131:33221)
I20251212 21:11:09.878110 25116 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.131:47237
I20251212 21:11:09.915206 28664 log.cc:826] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Log is configured to *not* fsync() on all Append() calls
I20251212 21:11:10.023384 28422 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:10.034497 28706 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:10.047242 28573 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:10.055625 28306 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:10.401577 28372 heartbeater.cc:499] Master 127.23.110.190:45865 was elected leader, sending a full tablet report...
I20251212 21:11:10.539956 28506 heartbeater.cc:499] Master 127.23.110.190:45865 was elected leader, sending a full tablet report...
I20251212 21:11:10.572301 28264 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Bootstrap replayed 1/3 log segments. Stats: ops{read=4625 overwritten=0 applied=4623 ignored=0} inserts{seen=230950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251212 21:11:10.666769 28639 heartbeater.cc:499] Master 127.23.110.190:45865 was elected leader, sending a full tablet report...
I20251212 21:11:10.740612 28664 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Bootstrap replayed 1/3 log segments. Stats: ops{read=4625 overwritten=0 applied=4622 ignored=0} inserts{seen=230900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251212 21:11:10.817617 28398 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Bootstrap replayed 1/3 log segments. Stats: ops{read=4626 overwritten=0 applied=4623 ignored=0} inserts{seen=230950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251212 21:11:10.878969 28772 heartbeater.cc:499] Master 127.23.110.190:45865 was elected leader, sending a full tablet report...
I20251212 21:11:11.582408 28664 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Bootstrap replayed 2/3 log segments. Stats: ops{read=9365 overwritten=0 applied=9364 ignored=0} inserts{seen=467950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20251212 21:11:11.893919 28264 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Bootstrap replayed 2/3 log segments. Stats: ops{read=9247 overwritten=0 applied=9245 ignored=0} inserts{seen=462000 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251212 21:11:12.035058 28664 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Bootstrap replayed 3/3 log segments. Stats: ops{read=11778 overwritten=0 applied=11775 ignored=0} inserts{seen=588500 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251212 21:11:12.035498 28664 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Bootstrap complete.
I20251212 21:11:12.039252 28664 ts_tablet_manager.cc:1403] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Time spent bootstrapping tablet: real 2.201s	user 1.885s	sys 0.294s
I20251212 21:11:12.040400 28664 raft_consensus.cc:359] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 10 FOLLOWER]: Replica starting. Triggering 3 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:12.041080 28664 raft_consensus.cc:740] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 10 FOLLOWER]: Becoming Follower/Learner. State: Replica: c941504e89314a6a868d59585d254b81, State: Initialized, Role: FOLLOWER
I20251212 21:11:12.041303 28664 consensus_queue.cc:260] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 11775, Last appended: 9.11778, Last appended by leader: 11778, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:12.041543 28664 ts_tablet_manager.cc:1434] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Time spent starting tablet: real 0.002s	user 0.004s	sys 0.001s
I20251212 21:11:12.129590 28398 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Bootstrap replayed 2/3 log segments. Stats: ops{read=9347 overwritten=0 applied=9346 ignored=0} inserts{seen=467050 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20251212 21:11:12.370071 28813 raft_consensus.cc:493] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 10 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251212 21:11:12.370236 28813 raft_consensus.cc:515] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 10 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:12.370571 28813 leader_election.cc:290] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [CANDIDATE]: Term 11 pre-election: Requested pre-vote from peers 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339), dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130:36255)
I20251212 21:11:12.374807 28460 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "c941504e89314a6a868d59585d254b81" candidate_term: 11 candidate_status { last_received { term: 9 index: 11778 } } ignore_live_leader: false dest_uuid: "35867ec45b8041d48fc8c7bb132375c5" is_pre_election: true
I20251212 21:11:12.374892 28326 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "c941504e89314a6a868d59585d254b81" candidate_term: 11 candidate_status { last_received { term: 9 index: 11778 } } ignore_live_leader: false dest_uuid: "dd9e48fb810447718c09aca5a01b0fe3" is_pre_election: true
W20251212 21:11:12.375947 28659 leader_election.cc:343] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [CANDIDATE]: Term 11 pre-election: Tablet error from VoteRequest() call to peer 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339): Illegal state: must be running to vote when last-logged opid is not known
W20251212 21:11:12.376055 28660 leader_election.cc:343] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [CANDIDATE]: Term 11 pre-election: Tablet error from VoteRequest() call to peer dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130:36255): Illegal state: must be running to vote when last-logged opid is not known
I20251212 21:11:12.376122 28660 leader_election.cc:304] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [CANDIDATE]: Term 11 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: c941504e89314a6a868d59585d254b81; no voters: 35867ec45b8041d48fc8c7bb132375c5, dd9e48fb810447718c09aca5a01b0fe3
I20251212 21:11:12.376262 28813 raft_consensus.cc:2749] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 10 FOLLOWER]: Leader pre-election lost for term 11. Reason: could not achieve majority
I20251212 21:11:12.428035 28264 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Bootstrap replayed 3/3 log segments. Stats: ops{read=11778 overwritten=0 applied=11775 ignored=0} inserts{seen=588500 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251212 21:11:12.428454 28264 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Bootstrap complete.
I20251212 21:11:12.432258 28264 ts_tablet_manager.cc:1403] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Time spent bootstrapping tablet: real 3.046s	user 2.670s	sys 0.347s
I20251212 21:11:12.432796 28264 raft_consensus.cc:359] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 10 FOLLOWER]: Replica starting. Triggering 3 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:12.433454 28264 raft_consensus.cc:740] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 10 FOLLOWER]: Becoming Follower/Learner. State: Replica: dd9e48fb810447718c09aca5a01b0fe3, State: Initialized, Role: FOLLOWER
I20251212 21:11:12.433584 28264 consensus_queue.cc:260] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 11775, Last appended: 9.11778, Last appended by leader: 11778, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:12.433782 28264 ts_tablet_manager.cc:1434] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Time spent starting tablet: real 0.001s	user 0.004s	sys 0.000s
I20251212 21:11:12.577507 28398 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Bootstrap replayed 3/3 log segments. Stats: ops{read=11775 overwritten=0 applied=11775 ignored=0} inserts{seen=588500 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251212 21:11:12.577947 28398 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Bootstrap complete.
I20251212 21:11:12.581683 28398 ts_tablet_manager.cc:1403] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Time spent bootstrapping tablet: real 3.066s	user 2.696s	sys 0.351s
I20251212 21:11:12.582229 28398 raft_consensus.cc:359] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 10 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:12.582423 28398 raft_consensus.cc:740] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 10 FOLLOWER]: Becoming Follower/Learner. State: Replica: 35867ec45b8041d48fc8c7bb132375c5, State: Initialized, Role: FOLLOWER
I20251212 21:11:12.582505 28398 consensus_queue.cc:260] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 11775, Last appended: 9.11775, Last appended by leader: 11775, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:12.582737 28398 ts_tablet_manager.cc:1434] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Time spent starting tablet: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:11:12.723395 28820 raft_consensus.cc:493] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 10 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251212 21:11:12.723558 28820 raft_consensus.cc:515] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 10 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:12.723862 28820 leader_election.cc:290] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 11 pre-election: Requested pre-vote from peers 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339), c941504e89314a6a868d59585d254b81 (127.23.110.131:33221)
I20251212 21:11:12.726536 28460 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "dd9e48fb810447718c09aca5a01b0fe3" candidate_term: 11 candidate_status { last_received { term: 9 index: 11778 } } ignore_live_leader: false dest_uuid: "35867ec45b8041d48fc8c7bb132375c5" is_pre_election: true
I20251212 21:11:12.726696 28460 raft_consensus.cc:2468] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 10 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate dd9e48fb810447718c09aca5a01b0fe3 in term 10.
I20251212 21:11:12.726663 28713 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "dd9e48fb810447718c09aca5a01b0fe3" candidate_term: 11 candidate_status { last_received { term: 9 index: 11778 } } ignore_live_leader: false dest_uuid: "c941504e89314a6a868d59585d254b81" is_pre_election: true
I20251212 21:11:12.726790 28713 raft_consensus.cc:2468] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 10 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate dd9e48fb810447718c09aca5a01b0fe3 in term 10.
I20251212 21:11:12.727152 28259 leader_election.cc:304] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 11 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 35867ec45b8041d48fc8c7bb132375c5, dd9e48fb810447718c09aca5a01b0fe3; no voters: 
I20251212 21:11:12.727281 28820 raft_consensus.cc:2804] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 10 FOLLOWER]: Leader pre-election won for term 11
I20251212 21:11:12.727352 28820 raft_consensus.cc:493] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 10 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251212 21:11:12.727373 28820 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 10 FOLLOWER]: Advancing to term 11
I20251212 21:11:12.728286 28820 raft_consensus.cc:515] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 11 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:12.728418 28820 leader_election.cc:290] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 11 election: Requested vote from peers 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339), c941504e89314a6a868d59585d254b81 (127.23.110.131:33221)
I20251212 21:11:12.728600 28713 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "dd9e48fb810447718c09aca5a01b0fe3" candidate_term: 11 candidate_status { last_received { term: 9 index: 11778 } } ignore_live_leader: false dest_uuid: "c941504e89314a6a868d59585d254b81"
I20251212 21:11:12.728614 28460 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "dd9e48fb810447718c09aca5a01b0fe3" candidate_term: 11 candidate_status { last_received { term: 9 index: 11778 } } ignore_live_leader: false dest_uuid: "35867ec45b8041d48fc8c7bb132375c5"
I20251212 21:11:12.728673 28713 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 10 FOLLOWER]: Advancing to term 11
I20251212 21:11:12.728690 28460 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 10 FOLLOWER]: Advancing to term 11
I20251212 21:11:12.729585 28713 raft_consensus.cc:2468] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 11 FOLLOWER]: Leader election vote request: Granting yes vote for candidate dd9e48fb810447718c09aca5a01b0fe3 in term 11.
I20251212 21:11:12.729736 28460 raft_consensus.cc:2468] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 11 FOLLOWER]: Leader election vote request: Granting yes vote for candidate dd9e48fb810447718c09aca5a01b0fe3 in term 11.
I20251212 21:11:12.729797 28261 leader_election.cc:304] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 11 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: c941504e89314a6a868d59585d254b81, dd9e48fb810447718c09aca5a01b0fe3; no voters: 
I20251212 21:11:12.729916 28820 raft_consensus.cc:2804] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 11 FOLLOWER]: Leader election won for term 11
I20251212 21:11:12.730082 28820 raft_consensus.cc:697] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 11 LEADER]: Becoming Leader. State: Replica: dd9e48fb810447718c09aca5a01b0fe3, State: Running, Role: LEADER
I20251212 21:11:12.730183 28820 consensus_queue.cc:237] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 11775, Committed index: 11775, Last appended: 9.11778, Last appended by leader: 11778, Current term: 11, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:12.730852 25114 catalog_manager.cc:5654] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 reported cstate change: term changed from 9 to 11. New cstate: current_term: 11 leader_uuid: "dd9e48fb810447718c09aca5a01b0fe3" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } health_report { overall_health: HEALTHY } } }
I20251212 21:11:12.806120 28460 raft_consensus.cc:1275] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 11 FOLLOWER]: Refusing update from remote peer dd9e48fb810447718c09aca5a01b0fe3: Log matching property violated. Preceding OpId in replica: term: 9 index: 11775. Preceding OpId from leader: term: 11 index: 11779. (index mismatch)
I20251212 21:11:12.806442 28820 consensus_queue.cc:1048] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [LEADER]: Connected to new peer: Peer: permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 11779, Last known committed idx: 11775, Time since last communication: 0.000s
I20251212 21:11:12.808931 28713 raft_consensus.cc:1275] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 11 FOLLOWER]: Refusing update from remote peer dd9e48fb810447718c09aca5a01b0fe3: Log matching property violated. Preceding OpId in replica: term: 9 index: 11778. Preceding OpId from leader: term: 11 index: 11779. (index mismatch)
I20251212 21:11:12.809373 28825 consensus_queue.cc:1048] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [LEADER]: Connected to new peer: Peer: permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 11779, Last known committed idx: 11775, Time since last communication: 0.000s
W20251212 21:11:12.811334 28685 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:12.811334 28684 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:12.811556 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:12.813109 28415 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37602: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:12.816144 28415 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37602: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:12.817227 28415 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37602: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:12.919003 25744 scanner-internal.cc:458] Time spent opening tablet: real 3.711s	user 0.001s	sys 0.000s
W20251212 21:11:13.286053 25743 scanner-internal.cc:458] Time spent opening tablet: real 4.022s	user 0.001s	sys 0.000s
W20251212 21:11:13.292351 25745 scanner-internal.cc:458] Time spent opening tablet: real 4.012s	user 0.001s	sys 0.001s
I20251212 21:11:15.291293 28573 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:15.293062 28422 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:15.298192 28306 tablet_service.cc:1467] Tablet server has 1 leaders and 3 scanners
I20251212 21:11:15.304457 28706 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:15.689499 25116 ts_manager.cc:284] Unset tserver state for 3d78cc34680848ddacf9620033efe712 from MAINTENANCE_MODE
I20251212 21:11:15.690059 28639 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:15.695621 25114 ts_manager.cc:284] Unset tserver state for c941504e89314a6a868d59585d254b81 from MAINTENANCE_MODE
I20251212 21:11:15.730398 25114 ts_manager.cc:284] Unset tserver state for dd9e48fb810447718c09aca5a01b0fe3 from MAINTENANCE_MODE
I20251212 21:11:15.740796 25114 ts_manager.cc:284] Unset tserver state for 35867ec45b8041d48fc8c7bb132375c5 from MAINTENANCE_MODE
I20251212 21:11:15.811429 28506 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:15.814034 28772 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:15.823032 28372 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:16.029289 25114 ts_manager.cc:295] Set tserver state for 35867ec45b8041d48fc8c7bb132375c5 to MAINTENANCE_MODE
I20251212 21:11:16.147799 25114 ts_manager.cc:295] Set tserver state for dd9e48fb810447718c09aca5a01b0fe3 to MAINTENANCE_MODE
I20251212 21:11:16.196278 25114 ts_manager.cc:295] Set tserver state for 3d78cc34680848ddacf9620033efe712 to MAINTENANCE_MODE
I20251212 21:11:16.235646 25114 ts_manager.cc:295] Set tserver state for c941504e89314a6a868d59585d254b81 to MAINTENANCE_MODE
I20251212 21:11:16.252132 28422 tablet_service.cc:1460] Tablet server 35867ec45b8041d48fc8c7bb132375c5 set to quiescing
I20251212 21:11:16.252213 28422 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:16.406371 28306 tablet_service.cc:1460] Tablet server dd9e48fb810447718c09aca5a01b0fe3 set to quiescing
I20251212 21:11:16.406450 28306 tablet_service.cc:1467] Tablet server has 1 leaders and 3 scanners
I20251212 21:11:16.407052 28844 raft_consensus.cc:993] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: : Instructing follower c941504e89314a6a868d59585d254b81 to start an election
I20251212 21:11:16.407122 28844 raft_consensus.cc:1081] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 11 LEADER]: Signalling peer c941504e89314a6a868d59585d254b81 to start an election
I20251212 21:11:16.407439 28712 tablet_service.cc:2038] Received Run Leader Election RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122"
dest_uuid: "c941504e89314a6a868d59585d254b81"
 from {username='slave'} at 127.23.110.130:58063
I20251212 21:11:16.407538 28712 raft_consensus.cc:493] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 11 FOLLOWER]: Starting forced leader election (received explicit request)
I20251212 21:11:16.407565 28712 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 11 FOLLOWER]: Advancing to term 12
I20251212 21:11:16.407588 28897 raft_consensus.cc:993] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: : Instructing follower c941504e89314a6a868d59585d254b81 to start an election
I20251212 21:11:16.407650 28897 raft_consensus.cc:1081] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 11 LEADER]: Signalling peer c941504e89314a6a868d59585d254b81 to start an election
I20251212 21:11:16.407817 28713 tablet_service.cc:2038] Received Run Leader Election RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122"
dest_uuid: "c941504e89314a6a868d59585d254b81"
 from {username='slave'} at 127.23.110.130:58063
I20251212 21:11:16.408386 28712 raft_consensus.cc:515] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 12 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:16.408540 28711 raft_consensus.cc:1240] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 12 FOLLOWER]: Rejecting Update request from peer dd9e48fb810447718c09aca5a01b0fe3 for earlier term 11. Current term is 12. Ops: [11.14961-11.14962]
I20251212 21:11:16.408612 28712 leader_election.cc:290] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [CANDIDATE]: Term 12 election: Requested vote from peers 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339), dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130:36255)
I20251212 21:11:16.408826 28326 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "c941504e89314a6a868d59585d254b81" candidate_term: 12 candidate_status { last_received { term: 11 index: 14960 } } ignore_live_leader: true dest_uuid: "dd9e48fb810447718c09aca5a01b0fe3"
I20251212 21:11:16.408900 28326 raft_consensus.cc:3055] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 11 LEADER]: Stepping down as leader of term 11
I20251212 21:11:16.408929 28326 raft_consensus.cc:740] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 11 LEADER]: Becoming Follower/Learner. State: Replica: dd9e48fb810447718c09aca5a01b0fe3, State: Running, Role: LEADER
I20251212 21:11:16.408969 28326 consensus_queue.cc:260] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 14960, Committed index: 14960, Last appended: 11.14962, Last appended by leader: 14962, Current term: 11, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:16.408923 28713 raft_consensus.cc:493] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 12 FOLLOWER]: Starting forced leader election (received explicit request)
I20251212 21:11:16.409024 28713 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 12 FOLLOWER]: Advancing to term 13
I20251212 21:11:16.409034 28326 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 11 FOLLOWER]: Advancing to term 12
I20251212 21:11:16.409161 28460 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "c941504e89314a6a868d59585d254b81" candidate_term: 12 candidate_status { last_received { term: 11 index: 14960 } } ignore_live_leader: true dest_uuid: "35867ec45b8041d48fc8c7bb132375c5"
I20251212 21:11:16.409261 28460 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 11 FOLLOWER]: Advancing to term 12
I20251212 21:11:16.409760 28326 raft_consensus.cc:2410] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 12 FOLLOWER]: Leader election vote request: Denying vote to candidate c941504e89314a6a868d59585d254b81 for term 12 because replica has last-logged OpId of term: 11 index: 14962, which is greater than that of the candidate, which has last-logged OpId of term: 11 index: 14960.
I20251212 21:11:16.409723 28713 raft_consensus.cc:515] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 13 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:16.409907 28460 raft_consensus.cc:2410] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 12 FOLLOWER]: Leader election vote request: Denying vote to candidate c941504e89314a6a868d59585d254b81 for term 12 because replica has last-logged OpId of term: 11 index: 14962, which is greater than that of the candidate, which has last-logged OpId of term: 11 index: 14960.
I20251212 21:11:16.410045 28659 leader_election.cc:304] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [CANDIDATE]: Term 12 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: c941504e89314a6a868d59585d254b81; no voters: 35867ec45b8041d48fc8c7bb132375c5, dd9e48fb810447718c09aca5a01b0fe3
I20251212 21:11:16.410218 29009 raft_consensus.cc:2749] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 13 FOLLOWER]: Leader election lost for term 12. Reason: could not achieve majority
I20251212 21:11:16.410275 28713 leader_election.cc:290] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [CANDIDATE]: Term 13 election: Requested vote from peers 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339), dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130:36255)
I20251212 21:11:16.410333 28326 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "c941504e89314a6a868d59585d254b81" candidate_term: 13 candidate_status { last_received { term: 11 index: 14960 } } ignore_live_leader: true dest_uuid: "dd9e48fb810447718c09aca5a01b0fe3"
W20251212 21:11:16.410060 28844 consensus_queue.cc:1175] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [NON_LEADER]: Queue is closed or peer was untracked, disregarding peer response. Response: responder_uuid: "c941504e89314a6a868d59585d254b81" responder_term: 12 status { last_received { term: 11 index: 14960 } last_committed_idx: 14960 error { code: INVALID_TERM status { code: ILLEGAL_STATE message: "Rejecting Update request from peer dd9e48fb810447718c09aca5a01b0fe3 for earlier term 11. Current term is 12. Ops: [11.14961-11.14962]" } } last_received_current_leader { term: 0 index: 0 } }
I20251212 21:11:16.410533 28326 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 12 FOLLOWER]: Advancing to term 13
I20251212 21:11:16.410756 28460 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "c941504e89314a6a868d59585d254b81" candidate_term: 13 candidate_status { last_received { term: 11 index: 14960 } } ignore_live_leader: true dest_uuid: "35867ec45b8041d48fc8c7bb132375c5"
I20251212 21:11:16.410820 28460 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 12 FOLLOWER]: Advancing to term 13
I20251212 21:11:16.411209 28326 raft_consensus.cc:2410] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 13 FOLLOWER]: Leader election vote request: Denying vote to candidate c941504e89314a6a868d59585d254b81 for term 13 because replica has last-logged OpId of term: 11 index: 14962, which is greater than that of the candidate, which has last-logged OpId of term: 11 index: 14960.
I20251212 21:11:16.411432 28460 raft_consensus.cc:2410] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 13 FOLLOWER]: Leader election vote request: Denying vote to candidate c941504e89314a6a868d59585d254b81 for term 13 because replica has last-logged OpId of term: 11 index: 14962, which is greater than that of the candidate, which has last-logged OpId of term: 11 index: 14960.
I20251212 21:11:16.411576 28659 leader_election.cc:304] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [CANDIDATE]: Term 13 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: c941504e89314a6a868d59585d254b81; no voters: 35867ec45b8041d48fc8c7bb132375c5, dd9e48fb810447718c09aca5a01b0fe3
I20251212 21:11:16.412035 29009 raft_consensus.cc:2749] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 13 FOLLOWER]: Leader election lost for term 13. Reason: could not achieve majority
W20251212 21:11:16.412103 28281 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43788: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:16.414695 28415 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37602: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:16.420308 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:16.426301 28283 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43788: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:16.435112 28415 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37602: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:16.443806 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:16.452400 28282 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43788: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
I20251212 21:11:16.461115 28706 tablet_service.cc:1460] Tablet server c941504e89314a6a868d59585d254b81 set to quiescing
I20251212 21:11:16.461182 28706 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251212 21:11:16.463289 28415 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37602: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
I20251212 21:11:16.469482 28573 tablet_service.cc:1460] Tablet server 3d78cc34680848ddacf9620033efe712 set to quiescing
I20251212 21:11:16.469566 28573 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251212 21:11:16.476215 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:16.488255 28281 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43788: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:16.500896 28415 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37602: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:16.516472 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:16.532631 28281 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43788: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:16.547194 28415 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37602: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:16.565836 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:16.585817 28281 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43788: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:16.602447 28415 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37602: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:16.621116 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:16.642146 28282 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43788: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:16.662822 28415 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37602: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:16.669855 28844 raft_consensus.cc:670] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: failed to trigger leader election: Illegal state: leader elections are disabled
W20251212 21:11:16.685503 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
I20251212 21:11:16.691628 28639 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
W20251212 21:11:16.707614 28283 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43788: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:16.709450 29030 raft_consensus.cc:670] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: failed to trigger leader election: Illegal state: leader elections are disabled
W20251212 21:11:16.732211 28415 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37602: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:16.759914 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:16.784936 28281 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43788: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:16.812562 28415 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37602: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:16.841192 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:16.853319 29009 raft_consensus.cc:670] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: failed to trigger leader election: Illegal state: leader elections are disabled
W20251212 21:11:16.873307 28282 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43788: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:16.903944 28415 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37602: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:16.936830 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:16.971283 28283 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43788: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:17.006014 28415 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37602: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:17.042856 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:17.080345 28283 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43788: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:17.117338 28415 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37602: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:17.153538 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:17.191905 28283 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43788: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:17.229864 28415 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37602: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:17.271858 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:17.312222 28283 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43788: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:17.356156 28415 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37602: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:17.398025 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:17.442325 28283 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43788: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:17.489491 28415 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37602: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:17.537779 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
I20251212 21:11:17.553454 28306 tablet_service.cc:1460] Tablet server dd9e48fb810447718c09aca5a01b0fe3 set to quiescing
I20251212 21:11:17.553529 28306 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251212 21:11:17.586301 28283 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:43788: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
I20251212 21:11:17.609130 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 28242
W20251212 21:11:17.619870 25716 meta_cache.cc:302] tablet edaf6af028f1466d9dccb7d78cf88122: replica dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130:36255) has failed: Network error: recv got EOF from 127.23.110.130:36255 (error 108)
I20251212 21:11:17.620133 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.130:36255
--local_ip_for_outbound_sockets=127.23.110.130
--tserver_master_addrs=127.23.110.190:45865
--webserver_port=40621
--webserver_interface=127.23.110.130
--builtin_ntp_servers=127.23.110.148:39679
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251212 21:11:17.638018 28415 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:37602: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:17.687932 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:17.699021 29042 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:11:17.699203 29042 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:11:17.699231 29042 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:11:17.700656 29042 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:11:17.700716 29042 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.130
I20251212 21:11:17.702297 29042 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:39679
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.130:36255
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.23.110.130
--webserver_port=40621
--tserver_master_addrs=127.23.110.190:45865
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.29042
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.130
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:11:17.702526 29042 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:11:17.702734 29042 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:11:17.705276 29050 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:11:17.705298 29047 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:11:17.705345 29048 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:11:17.705547 29042 server_base.cc:1047] running on GCE node
I20251212 21:11:17.705832 29042 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:11:17.706023 29042 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:11:17.707151 29042 hybrid_clock.cc:648] HybridClock initialized: now 1765573877707141 us; error 27 us; skew 500 ppm
I20251212 21:11:17.708253 29042 webserver.cc:492] Webserver started at http://127.23.110.130:40621/ using document root <none> and password file <none>
I20251212 21:11:17.708431 29042 fs_manager.cc:362] Metadata directory not provided
I20251212 21:11:17.708468 29042 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:11:17.709671 29042 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:11:17.710340 29056 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:11:17.710490 29042 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20251212 21:11:17.710593 29042 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
uuid: "dd9e48fb810447718c09aca5a01b0fe3"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:11:17.710872 29042 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251212 21:11:17.724033 29042 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:11:17.724287 29042 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:11:17.724387 29042 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:11:17.724601 29042 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:11:17.725019 29063 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251212 21:11:17.725894 29042 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251212 21:11:17.725935 29042 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:11:17.725972 29042 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251212 21:11:17.726603 29042 ts_tablet_manager.cc:616] Registered 1 tablets
I20251212 21:11:17.726639 29042 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:11:17.726698 29063 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Bootstrap starting.
I20251212 21:11:17.732591 29042 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.130:36255
I20251212 21:11:17.732640 29170 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.130:36255 every 8 connection(s)
I20251212 21:11:17.732928 29042 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data/info.pb
I20251212 21:11:17.734674 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 29042
I20251212 21:11:17.734791 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 28376
I20251212 21:11:17.741732 29171 heartbeater.cc:344] Connected to a master server at 127.23.110.190:45865
I20251212 21:11:17.741853 29171 heartbeater.cc:461] Registering TS with master...
I20251212 21:11:17.742082 29171 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:17.742627 25114 ts_manager.cc:194] Re-registered known tserver with Master: dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130:36255)
I20251212 21:11:17.743147 25114 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.130:34991
I20251212 21:11:17.750255 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.129:42339
--local_ip_for_outbound_sockets=127.23.110.129
--tserver_master_addrs=127.23.110.190:45865
--webserver_port=34247
--webserver_interface=127.23.110.129
--builtin_ntp_servers=127.23.110.148:39679
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251212 21:11:17.759150 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:17.762318 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
I20251212 21:11:17.763118 29063 log.cc:826] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Log is configured to *not* fsync() on all Append() calls
W20251212 21:11:17.776103 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:17.787165 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:17.799299 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:17.802395 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:17.821645 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:17.833887 29177 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:11:17.834049 29177 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:11:17.834077 29177 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:11:17.835599 29177 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:11:17.835660 29177 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.129
I20251212 21:11:17.837356 29177 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:39679
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.129:42339
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.23.110.129
--webserver_port=34247
--tserver_master_addrs=127.23.110.190:45865
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.29177
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.129
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:11:17.837575 29177 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:11:17.837795 29177 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:11:17.838717 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:17.840322 29184 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:11:17.840349 29186 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:11:17.840349 29183 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:11:17.840755 29177 server_base.cc:1047] running on GCE node
I20251212 21:11:17.840934 29177 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:11:17.841166 29177 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:11:17.842321 29177 hybrid_clock.cc:648] HybridClock initialized: now 1765573877842303 us; error 29 us; skew 500 ppm
I20251212 21:11:17.843623 29177 webserver.cc:492] Webserver started at http://127.23.110.129:34247/ using document root <none> and password file <none>
I20251212 21:11:17.843835 29177 fs_manager.cc:362] Metadata directory not provided
I20251212 21:11:17.843889 29177 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:11:17.845415 29177 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:11:17.846210 29192 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:11:17.846386 29177 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.001s	sys 0.000s
I20251212 21:11:17.846508 29177 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
uuid: "35867ec45b8041d48fc8c7bb132375c5"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:11:17.846849 29177 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20251212 21:11:17.867278 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
I20251212 21:11:17.868724 29177 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:11:17.869031 29177 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:11:17.869211 29177 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:11:17.869529 29177 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:11:17.870018 29199 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251212 21:11:17.870883 29177 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251212 21:11:17.870939 29177 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:11:17.870973 29177 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251212 21:11:17.871693 29177 ts_tablet_manager.cc:616] Registered 1 tablets
I20251212 21:11:17.871737 29177 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:11:17.871847 29199 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Bootstrap starting.
I20251212 21:11:17.878911 29177 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.129:42339
I20251212 21:11:17.879305 29177 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data/info.pb
I20251212 21:11:17.885316 29307 heartbeater.cc:344] Connected to a master server at 127.23.110.190:45865
I20251212 21:11:17.885416 29307 heartbeater.cc:461] Registering TS with master...
I20251212 21:11:17.885450 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 29177
I20251212 21:11:17.885531 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 28509
I20251212 21:11:17.885627 29307 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:17.886232 25114 ts_manager.cc:194] Re-registered known tserver with Master: 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339)
I20251212 21:11:17.886725 25114 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.129:57447
I20251212 21:11:17.891412 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.132:41841
--local_ip_for_outbound_sockets=127.23.110.132
--tserver_master_addrs=127.23.110.190:45865
--webserver_port=34523
--webserver_interface=127.23.110.132
--builtin_ntp_servers=127.23.110.148:39679
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251212 21:11:17.893357 29306 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.129:42339 every 8 connection(s)
I20251212 21:11:17.901827 29199 log.cc:826] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Log is configured to *not* fsync() on all Append() calls
W20251212 21:11:17.918751 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:17.934566 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:17.962613 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:17.978583 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:18.006438 29310 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:11:18.006675 29310 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:11:18.006708 29310 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:11:18.009016 29310 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:11:18.009097 29310 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.132
I20251212 21:11:18.011554 29310 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:39679
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.132:41841
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.23.110.132
--webserver_port=34523
--tserver_master_addrs=127.23.110.190:45865
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.29310
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.132
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:11:18.011809 29310 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:11:18.012079 29310 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:11:18.014997 29317 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:11:18.015055 29318 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:11:18.015317 29320 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:11:18.015615 29310 server_base.cc:1047] running on GCE node
I20251212 21:11:18.015789 29310 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:11:18.016003 29310 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:11:18.017164 29310 hybrid_clock.cc:648] HybridClock initialized: now 1765573878017142 us; error 35 us; skew 500 ppm
I20251212 21:11:18.018424 29310 webserver.cc:492] Webserver started at http://127.23.110.132:34523/ using document root <none> and password file <none>
I20251212 21:11:18.018602 29310 fs_manager.cc:362] Metadata directory not provided
I20251212 21:11:18.018641 29310 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:11:18.019816 29310 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
W20251212 21:11:18.020185 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
I20251212 21:11:18.020673 29326 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:11:18.020907 29310 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:11:18.020983 29310 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
uuid: "3d78cc34680848ddacf9620033efe712"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:11:18.021329 29310 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20251212 21:11:18.022423 28686 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:60110: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
I20251212 21:11:18.050319 29310 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:11:18.050622 29310 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:11:18.050755 29310 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:11:18.051013 29310 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:11:18.051396 29310 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251212 21:11:18.051440 29310 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:11:18.051476 29310 ts_tablet_manager.cc:616] Registered 0 tablets
I20251212 21:11:18.051507 29310 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:11:18.058629 29310 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.132:41841
I20251212 21:11:18.059015 29310 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data/info.pb
I20251212 21:11:18.059422 29439 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.132:41841 every 8 connection(s)
I20251212 21:11:18.064502 29440 heartbeater.cc:344] Connected to a master server at 127.23.110.190:45865
I20251212 21:11:18.064644 29440 heartbeater.cc:461] Registering TS with master...
I20251212 21:11:18.064919 29440 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:18.065354 25114 ts_manager.cc:194] Re-registered known tserver with Master: 3d78cc34680848ddacf9620033efe712 (127.23.110.132:41841)
I20251212 21:11:18.065838 25114 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.132:48215
I20251212 21:11:18.068259 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 29310
I20251212 21:11:18.068352 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 28642
I20251212 21:11:18.081341 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.131:33221
--local_ip_for_outbound_sockets=127.23.110.131
--tserver_master_addrs=127.23.110.190:45865
--webserver_port=43311
--webserver_interface=127.23.110.131
--builtin_ntp_servers=127.23.110.148:39679
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251212 21:11:18.123683 25744 meta_cache.cc:1510] marking tablet server c941504e89314a6a868d59585d254b81 (127.23.110.131:33221) as failed
W20251212 21:11:18.196484 29443 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:11:18.196720 29443 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:11:18.196764 29443 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:11:18.199138 29443 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:11:18.199239 29443 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.131
I20251212 21:11:18.201817 29443 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:39679
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.131:33221
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.23.110.131
--webserver_port=43311
--tserver_master_addrs=127.23.110.190:45865
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.29443
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.131
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:11:18.202133 29443 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:11:18.202414 29443 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:11:18.205770 29448 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:11:18.205780 29451 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:11:18.205863 29449 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:11:18.206498 29443 server_base.cc:1047] running on GCE node
I20251212 21:11:18.206696 29443 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:11:18.206935 29443 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:11:18.208098 29443 hybrid_clock.cc:648] HybridClock initialized: now 1765573878208071 us; error 38 us; skew 500 ppm
I20251212 21:11:18.209566 29443 webserver.cc:492] Webserver started at http://127.23.110.131:43311/ using document root <none> and password file <none>
I20251212 21:11:18.209806 29443 fs_manager.cc:362] Metadata directory not provided
I20251212 21:11:18.209860 29443 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:11:18.211408 29443 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.002s	sys 0.000s
I20251212 21:11:18.212250 29457 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:11:18.212487 29443 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:11:18.212571 29443 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
uuid: "c941504e89314a6a868d59585d254b81"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:11:18.212945 29443 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251212 21:11:18.227423 29443 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:11:18.227710 29443 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:11:18.227842 29443 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:11:18.228093 29443 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:11:18.228602 29464 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251212 21:11:18.229908 29443 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251212 21:11:18.229969 29443 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.002s	user 0.000s	sys 0.000s
I20251212 21:11:18.230008 29443 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251212 21:11:18.230711 29443 ts_tablet_manager.cc:616] Registered 1 tablets
I20251212 21:11:18.230767 29443 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:11:18.230911 29464 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Bootstrap starting.
I20251212 21:11:18.237637 29443 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.131:33221
I20251212 21:11:18.238042 29443 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data/info.pb
I20251212 21:11:18.246701 29571 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.131:33221 every 8 connection(s)
I20251212 21:11:18.247365 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 29443
I20251212 21:11:18.253964 29572 heartbeater.cc:344] Connected to a master server at 127.23.110.190:45865
I20251212 21:11:18.254063 29572 heartbeater.cc:461] Registering TS with master...
I20251212 21:11:18.254251 29572 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:18.254853 25114 ts_manager.cc:194] Re-registered known tserver with Master: c941504e89314a6a868d59585d254b81 (127.23.110.131:33221)
I20251212 21:11:18.255367 25114 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.131:58191
I20251212 21:11:18.269282 29464 log.cc:826] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Log is configured to *not* fsync() on all Append() calls
I20251212 21:11:18.423961 29241 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:18.429073 29374 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:18.436340 29105 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:18.445333 29487 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:18.744246 29171 heartbeater.cc:499] Master 127.23.110.190:45865 was elected leader, sending a full tablet report...
I20251212 21:11:18.778822 29063 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Bootstrap replayed 1/4 log segments. Stats: ops{read=4625 overwritten=0 applied=4623 ignored=0} inserts{seen=230950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251212 21:11:18.887565 29307 heartbeater.cc:499] Master 127.23.110.190:45865 was elected leader, sending a full tablet report...
I20251212 21:11:18.964478 29199 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Bootstrap replayed 1/4 log segments. Stats: ops{read=4626 overwritten=0 applied=4626 ignored=0} inserts{seen=231100 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251212 21:11:19.066718 29440 heartbeater.cc:499] Master 127.23.110.190:45865 was elected leader, sending a full tablet report...
I20251212 21:11:19.256266 29572 heartbeater.cc:499] Master 127.23.110.190:45865 was elected leader, sending a full tablet report...
I20251212 21:11:19.568720 29464 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Bootstrap replayed 1/4 log segments. Stats: ops{read=4625 overwritten=0 applied=4622 ignored=0} inserts{seen=230900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251212 21:11:19.629109 29063 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Bootstrap replayed 2/4 log segments. Stats: ops{read=9247 overwritten=0 applied=9245 ignored=0} inserts{seen=462000 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251212 21:11:20.291390 29199 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Bootstrap replayed 2/4 log segments. Stats: ops{read=9247 overwritten=0 applied=9245 ignored=0} inserts{seen=462000 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251212 21:11:20.547861 29063 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Bootstrap replayed 3/4 log segments. Stats: ops{read=13868 overwritten=0 applied=13865 ignored=0} inserts{seen=692950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251212 21:11:20.765487 29063 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Bootstrap replayed 4/4 log segments. Stats: ops{read=14962 overwritten=0 applied=14960 ignored=0} inserts{seen=747700 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251212 21:11:20.765955 29063 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Bootstrap complete.
I20251212 21:11:20.771153 29063 ts_tablet_manager.cc:1403] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Time spent bootstrapping tablet: real 3.045s	user 2.638s	sys 0.362s
I20251212 21:11:20.772428 29063 raft_consensus.cc:359] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 13 FOLLOWER]: Replica starting. Triggering 2 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:20.773128 29063 raft_consensus.cc:740] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 13 FOLLOWER]: Becoming Follower/Learner. State: Replica: dd9e48fb810447718c09aca5a01b0fe3, State: Initialized, Role: FOLLOWER
I20251212 21:11:20.773332 29063 consensus_queue.cc:260] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 14960, Last appended: 11.14962, Last appended by leader: 14962, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:20.773612 29063 ts_tablet_manager.cc:1434] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Time spent starting tablet: real 0.002s	user 0.004s	sys 0.000s
W20251212 21:11:20.792596 29085 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58502: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
I20251212 21:11:20.869475 29464 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Bootstrap replayed 2/4 log segments. Stats: ops{read=9248 overwritten=0 applied=9245 ignored=0} inserts{seen=462000 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
W20251212 21:11:20.926636 25744 scanner-internal.cc:458] Time spent opening tablet: real 4.007s	user 0.001s	sys 0.000s
W20251212 21:11:21.076010 29085 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58502: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:21.094570 25743 scanner-internal.cc:458] Time spent opening tablet: real 4.006s	user 0.000s	sys 0.001s
W20251212 21:11:21.096264 25745 scanner-internal.cc:458] Time spent opening tablet: real 4.005s	user 0.001s	sys 0.000s
I20251212 21:11:21.099504 29617 raft_consensus.cc:493] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 13 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251212 21:11:21.099634 29617 raft_consensus.cc:515] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 13 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:21.099963 29617 leader_election.cc:290] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 14 pre-election: Requested pre-vote from peers 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339), c941504e89314a6a868d59585d254b81 (127.23.110.131:33221)
I20251212 21:11:21.122748 29261 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "dd9e48fb810447718c09aca5a01b0fe3" candidate_term: 14 candidate_status { last_received { term: 11 index: 14962 } } ignore_live_leader: false dest_uuid: "35867ec45b8041d48fc8c7bb132375c5" is_pre_election: true
I20251212 21:11:21.123797 29519 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "dd9e48fb810447718c09aca5a01b0fe3" candidate_term: 14 candidate_status { last_received { term: 11 index: 14962 } } ignore_live_leader: false dest_uuid: "c941504e89314a6a868d59585d254b81" is_pre_election: true
W20251212 21:11:21.124996 29060 leader_election.cc:343] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 14 pre-election: Tablet error from VoteRequest() call to peer c941504e89314a6a868d59585d254b81 (127.23.110.131:33221): Illegal state: must be running to vote when last-logged opid is not known
W20251212 21:11:21.125669 29058 leader_election.cc:343] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 14 pre-election: Tablet error from VoteRequest() call to peer 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339): Illegal state: must be running to vote when last-logged opid is not known
I20251212 21:11:21.125823 29058 leader_election.cc:304] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 14 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: dd9e48fb810447718c09aca5a01b0fe3; no voters: 35867ec45b8041d48fc8c7bb132375c5, c941504e89314a6a868d59585d254b81
I20251212 21:11:21.125962 29617 raft_consensus.cc:2749] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 13 FOLLOWER]: Leader pre-election lost for term 14. Reason: could not achieve majority
W20251212 21:11:21.369812 29085 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58502: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
I20251212 21:11:21.496059 29617 raft_consensus.cc:493] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 13 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251212 21:11:21.496165 29617 raft_consensus.cc:515] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 13 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:21.496311 29617 leader_election.cc:290] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 14 pre-election: Requested pre-vote from peers 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339), c941504e89314a6a868d59585d254b81 (127.23.110.131:33221)
I20251212 21:11:21.496440 29261 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "dd9e48fb810447718c09aca5a01b0fe3" candidate_term: 14 candidate_status { last_received { term: 11 index: 14962 } } ignore_live_leader: false dest_uuid: "35867ec45b8041d48fc8c7bb132375c5" is_pre_election: true
I20251212 21:11:21.496652 29519 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "dd9e48fb810447718c09aca5a01b0fe3" candidate_term: 14 candidate_status { last_received { term: 11 index: 14962 } } ignore_live_leader: false dest_uuid: "c941504e89314a6a868d59585d254b81" is_pre_election: true
W20251212 21:11:21.496683 29058 leader_election.cc:343] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 14 pre-election: Tablet error from VoteRequest() call to peer 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339): Illegal state: must be running to vote when last-logged opid is not known
W20251212 21:11:21.496891 29060 leader_election.cc:343] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 14 pre-election: Tablet error from VoteRequest() call to peer c941504e89314a6a868d59585d254b81 (127.23.110.131:33221): Illegal state: must be running to vote when last-logged opid is not known
I20251212 21:11:21.496937 29060 leader_election.cc:304] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 14 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: dd9e48fb810447718c09aca5a01b0fe3; no voters: 35867ec45b8041d48fc8c7bb132375c5, c941504e89314a6a868d59585d254b81
I20251212 21:11:21.497288 29617 raft_consensus.cc:2749] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 13 FOLLOWER]: Leader pre-election lost for term 14. Reason: could not achieve majority
I20251212 21:11:21.564595 29199 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Bootstrap replayed 3/4 log segments. Stats: ops{read=13975 overwritten=0 applied=13974 ignored=0} inserts{seen=698400 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
W20251212 21:11:21.671975 29085 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58502: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
I20251212 21:11:21.794500 29199 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Bootstrap replayed 4/4 log segments. Stats: ops{read=14962 overwritten=0 applied=14960 ignored=0} inserts{seen=747700 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251212 21:11:21.794983 29199 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Bootstrap complete.
I20251212 21:11:21.800657 29199 ts_tablet_manager.cc:1403] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Time spent bootstrapping tablet: real 3.929s	user 3.425s	sys 0.476s
I20251212 21:11:21.801673 29199 raft_consensus.cc:359] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 13 FOLLOWER]: Replica starting. Triggering 2 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:21.802340 29199 raft_consensus.cc:740] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 13 FOLLOWER]: Becoming Follower/Learner. State: Replica: 35867ec45b8041d48fc8c7bb132375c5, State: Initialized, Role: FOLLOWER
I20251212 21:11:21.802523 29199 consensus_queue.cc:260] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 14960, Last appended: 11.14962, Last appended by leader: 14962, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:21.802768 29199 ts_tablet_manager.cc:1434] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Time spent starting tablet: real 0.002s	user 0.005s	sys 0.000s
W20251212 21:11:21.984364 29082 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58502: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
I20251212 21:11:22.014554 29627 raft_consensus.cc:493] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 13 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251212 21:11:22.014736 29627 raft_consensus.cc:515] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 13 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:22.015022 29627 leader_election.cc:290] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 14 pre-election: Requested pre-vote from peers 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339), c941504e89314a6a868d59585d254b81 (127.23.110.131:33221)
I20251212 21:11:22.015156 29519 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "dd9e48fb810447718c09aca5a01b0fe3" candidate_term: 14 candidate_status { last_received { term: 11 index: 14962 } } ignore_live_leader: false dest_uuid: "c941504e89314a6a868d59585d254b81" is_pre_election: true
W20251212 21:11:22.015396 29060 leader_election.cc:343] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 14 pre-election: Tablet error from VoteRequest() call to peer c941504e89314a6a868d59585d254b81 (127.23.110.131:33221): Illegal state: must be running to vote when last-logged opid is not known
I20251212 21:11:22.015362 29261 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "dd9e48fb810447718c09aca5a01b0fe3" candidate_term: 14 candidate_status { last_received { term: 11 index: 14962 } } ignore_live_leader: false dest_uuid: "35867ec45b8041d48fc8c7bb132375c5" is_pre_election: true
I20251212 21:11:22.015543 29261 raft_consensus.cc:2468] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 13 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate dd9e48fb810447718c09aca5a01b0fe3 in term 13.
I20251212 21:11:22.015827 29058 leader_election.cc:304] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 14 pre-election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 35867ec45b8041d48fc8c7bb132375c5, dd9e48fb810447718c09aca5a01b0fe3; no voters: c941504e89314a6a868d59585d254b81
I20251212 21:11:22.015913 29627 raft_consensus.cc:2804] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 13 FOLLOWER]: Leader pre-election won for term 14
I20251212 21:11:22.016006 29627 raft_consensus.cc:493] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 13 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251212 21:11:22.016094 29627 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 13 FOLLOWER]: Advancing to term 14
I20251212 21:11:22.017654 29627 raft_consensus.cc:515] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 14 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:22.017871 29627 leader_election.cc:290] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 14 election: Requested vote from peers 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339), c941504e89314a6a868d59585d254b81 (127.23.110.131:33221)
I20251212 21:11:22.018013 29519 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "dd9e48fb810447718c09aca5a01b0fe3" candidate_term: 14 candidate_status { last_received { term: 11 index: 14962 } } ignore_live_leader: false dest_uuid: "c941504e89314a6a868d59585d254b81"
I20251212 21:11:22.018075 29261 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "dd9e48fb810447718c09aca5a01b0fe3" candidate_term: 14 candidate_status { last_received { term: 11 index: 14962 } } ignore_live_leader: false dest_uuid: "35867ec45b8041d48fc8c7bb132375c5"
I20251212 21:11:22.018160 29261 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 13 FOLLOWER]: Advancing to term 14
W20251212 21:11:22.018213 29060 leader_election.cc:343] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 14 election: Tablet error from VoteRequest() call to peer c941504e89314a6a868d59585d254b81 (127.23.110.131:33221): Illegal state: must be running to vote when last-logged opid is not known
I20251212 21:11:22.019433 29261 raft_consensus.cc:2468] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 14 FOLLOWER]: Leader election vote request: Granting yes vote for candidate dd9e48fb810447718c09aca5a01b0fe3 in term 14.
I20251212 21:11:22.019690 29058 leader_election.cc:304] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 14 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 35867ec45b8041d48fc8c7bb132375c5, dd9e48fb810447718c09aca5a01b0fe3; no voters: c941504e89314a6a868d59585d254b81
I20251212 21:11:22.019827 29627 raft_consensus.cc:2804] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 14 FOLLOWER]: Leader election won for term 14
I20251212 21:11:22.019982 29627 raft_consensus.cc:697] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 14 LEADER]: Becoming Leader. State: Replica: dd9e48fb810447718c09aca5a01b0fe3, State: Running, Role: LEADER
I20251212 21:11:22.020172 29627 consensus_queue.cc:237] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 14960, Committed index: 14960, Last appended: 11.14962, Last appended by leader: 14962, Current term: 14, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:22.020929 25116 catalog_manager.cc:5654] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 reported cstate change: term changed from 11 to 14. New cstate: current_term: 14 leader_uuid: "dd9e48fb810447718c09aca5a01b0fe3" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } health_report { overall_health: HEALTHY } } }
W20251212 21:11:22.092093 29221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39996: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
I20251212 21:11:22.129132 29261 raft_consensus.cc:1275] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 14 FOLLOWER]: Refusing update from remote peer dd9e48fb810447718c09aca5a01b0fe3: Log matching property violated. Preceding OpId in replica: term: 11 index: 14962. Preceding OpId from leader: term: 14 index: 14963. (index mismatch)
I20251212 21:11:22.129555 29627 consensus_queue.cc:1048] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [LEADER]: Connected to new peer: Peer: permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 14963, Last known committed idx: 14960, Time since last communication: 0.000s
W20251212 21:11:22.131611 29060 consensus_peers.cc:597] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 -> Peer c941504e89314a6a868d59585d254b81 (127.23.110.131:33221): Couldn't send request to peer c941504e89314a6a868d59585d254b81. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 1: this message will repeat every 5th retry.
I20251212 21:11:22.132808 29635 rpcz_store.cc:275] Call kudu.tserver.TabletServerService.Write from 127.0.0.1:58502 (ReqId={client: 52d578ba4b7943e2b18c17bec32f3fab, seq_no=14954, attempt_no=75}) took 1270 ms. Trace:
I20251212 21:11:22.133009 29636 rpcz_store.cc:275] Call kudu.tserver.TabletServerService.Write from 127.0.0.1:58502 (ReqId={client: 52d578ba4b7943e2b18c17bec32f3fab, seq_no=14955, attempt_no=75}) took 1284 ms. Trace:
I20251212 21:11:22.132992 29635 rpcz_store.cc:276] 1212 21:11:20.862267 (+     0us) service_pool.cc:168] Inserting onto call queue
1212 21:11:20.862310 (+    43us) service_pool.cc:225] Handling call
1212 21:11:22.132791 (+1270481us) inbound_call.cc:173] Queueing success response
Metrics: {}
I20251212 21:11:22.133080 29636 rpcz_store.cc:276] 1212 21:11:20.848062 (+     0us) service_pool.cc:168] Inserting onto call queue
1212 21:11:20.848105 (+    43us) service_pool.cc:225] Handling call
1212 21:11:22.133004 (+1284899us) inbound_call.cc:173] Queueing success response
Metrics: {}
W20251212 21:11:22.134363 29221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39996: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:22.134716 29221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39996: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:22.202108 29221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39996: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:22.203136 29221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39996: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
I20251212 21:11:22.341006 29464 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Bootstrap replayed 3/4 log segments. Stats: ops{read=13983 overwritten=0 applied=13982 ignored=0} inserts{seen=698800 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
W20251212 21:11:22.564136 29060 consensus_peers.cc:597] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 -> Peer c941504e89314a6a868d59585d254b81 (127.23.110.131:33221): Couldn't send request to peer c941504e89314a6a868d59585d254b81. Error code: TABLET_NOT_RUNNING (12). Status: Illegal state: Tablet not RUNNING: BOOTSTRAPPING. This is attempt 6: this message will repeat every 5th retry.
I20251212 21:11:22.634620 29464 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Bootstrap replayed 4/4 log segments. Stats: ops{read=14960 overwritten=0 applied=14960 ignored=0} inserts{seen=747700 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251212 21:11:22.635418 29464 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Bootstrap complete.
I20251212 21:11:22.642259 29464 ts_tablet_manager.cc:1403] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Time spent bootstrapping tablet: real 4.411s	user 3.785s	sys 0.556s
I20251212 21:11:22.642974 29464 raft_consensus.cc:359] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 13 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:22.643290 29464 raft_consensus.cc:740] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 13 FOLLOWER]: Becoming Follower/Learner. State: Replica: c941504e89314a6a868d59585d254b81, State: Initialized, Role: FOLLOWER
I20251212 21:11:22.643432 29464 consensus_queue.cc:260] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 14960, Last appended: 11.14960, Last appended by leader: 14960, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:22.643707 29464 ts_tablet_manager.cc:1434] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Time spent starting tablet: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:11:22.675637 29627 consensus_queue.cc:799] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [LEADER]: Peer c941504e89314a6a868d59585d254b81 is lagging by at least 24 ops behind the committed index 
I20251212 21:11:22.680708 29519 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 13 FOLLOWER]: Advancing to term 14
I20251212 21:11:22.682104 29519 raft_consensus.cc:1275] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 14 FOLLOWER]: Refusing update from remote peer dd9e48fb810447718c09aca5a01b0fe3: Log matching property violated. Preceding OpId in replica: term: 11 index: 14960. Preceding OpId from leader: term: 11 index: 14962. (index mismatch)
I20251212 21:11:22.683920 29627 consensus_queue.cc:1050] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [LEADER]: Got LMP mismatch error from peer: Peer: permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 14963, Last known committed idx: 14960, Time since last communication: 0.000s
I20251212 21:11:22.755115 29648 mvcc.cc:204] Tried to move back new op lower bound from 7231790623335784448 to 7231790620755701760. Current Snapshot: MvccSnapshot[applied={T|T < 7231790623114911744 or (T in {7231790623121620992,7231790623125741568,7231790623122382848})}]
I20251212 21:11:23.684646 29374 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:23.687321 29487 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:23.692919 29241 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:23.705660 29105 tablet_service.cc:1467] Tablet server has 1 leaders and 3 scanners
I20251212 21:11:24.026366 25113 ts_manager.cc:284] Unset tserver state for c941504e89314a6a868d59585d254b81 from MAINTENANCE_MODE
I20251212 21:11:24.070639 29440 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:24.131124 25113 ts_manager.cc:284] Unset tserver state for 3d78cc34680848ddacf9620033efe712 from MAINTENANCE_MODE
I20251212 21:11:24.133510 29307 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:24.154840 25113 ts_manager.cc:284] Unset tserver state for 35867ec45b8041d48fc8c7bb132375c5 from MAINTENANCE_MODE
I20251212 21:11:24.167845 25113 ts_manager.cc:284] Unset tserver state for dd9e48fb810447718c09aca5a01b0fe3 from MAINTENANCE_MODE
I20251212 21:11:24.546162 25113 ts_manager.cc:295] Set tserver state for c941504e89314a6a868d59585d254b81 to MAINTENANCE_MODE
I20251212 21:11:24.547415 25116 ts_manager.cc:295] Set tserver state for dd9e48fb810447718c09aca5a01b0fe3 to MAINTENANCE_MODE
I20251212 21:11:24.622947 25116 ts_manager.cc:295] Set tserver state for 35867ec45b8041d48fc8c7bb132375c5 to MAINTENANCE_MODE
I20251212 21:11:24.640575 25116 ts_manager.cc:295] Set tserver state for 3d78cc34680848ddacf9620033efe712 to MAINTENANCE_MODE
I20251212 21:11:24.707099 29572 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:24.738302 29171 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:24.841645 29487 tablet_service.cc:1460] Tablet server c941504e89314a6a868d59585d254b81 set to quiescing
I20251212 21:11:24.841703 29487 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:24.869300 29105 tablet_service.cc:1460] Tablet server dd9e48fb810447718c09aca5a01b0fe3 set to quiescing
I20251212 21:11:24.869366 29105 tablet_service.cc:1467] Tablet server has 1 leaders and 3 scanners
I20251212 21:11:24.891887 29374 tablet_service.cc:1460] Tablet server 3d78cc34680848ddacf9620033efe712 set to quiescing
I20251212 21:11:24.891961 29374 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:24.901800 29641 raft_consensus.cc:993] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: : Instructing follower 35867ec45b8041d48fc8c7bb132375c5 to start an election
I20251212 21:11:24.901891 29641 raft_consensus.cc:1081] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 14 LEADER]: Signalling peer 35867ec45b8041d48fc8c7bb132375c5 to start an election
I20251212 21:11:24.902262 29261 tablet_service.cc:2038] Received Run Leader Election RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122"
dest_uuid: "35867ec45b8041d48fc8c7bb132375c5"
 from {username='slave'} at 127.23.110.130:51327
I20251212 21:11:24.902359 29261 raft_consensus.cc:493] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 14 FOLLOWER]: Starting forced leader election (received explicit request)
I20251212 21:11:24.902387 29261 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 14 FOLLOWER]: Advancing to term 15
I20251212 21:11:24.903154 29261 raft_consensus.cc:515] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 15 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:24.903405 29261 leader_election.cc:290] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [CANDIDATE]: Term 15 election: Requested vote from peers c941504e89314a6a868d59585d254b81 (127.23.110.131:33221), dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130:36255)
I20251212 21:11:24.904067 29261 raft_consensus.cc:1240] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 15 FOLLOWER]: Rejecting Update request from peer dd9e48fb810447718c09aca5a01b0fe3 for earlier term 14. Current term is 15. Ops: [14.17205-14.17205]
I20251212 21:11:24.904327 29646 consensus_queue.cc:1059] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 }, Status: INVALID_TERM, Last received: 14.17204, Next index: 17205, Last known committed idx: 17204, Time since last communication: 0.000s
I20251212 21:11:24.904453 29646 raft_consensus.cc:3055] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 14 LEADER]: Stepping down as leader of term 14
I20251212 21:11:24.904480 29646 raft_consensus.cc:740] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 14 LEADER]: Becoming Follower/Learner. State: Replica: dd9e48fb810447718c09aca5a01b0fe3, State: Running, Role: LEADER
I20251212 21:11:24.904531 29646 consensus_queue.cc:260] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 17204, Committed index: 17204, Last appended: 14.17206, Last appended by leader: 17206, Current term: 14, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:24.904584 29646 logging.cc:424] LogThrottler TrackedPeer Lag: suppressed but not reported on 1 messages since previous log ~2 seconds ago
I20251212 21:11:24.904637 29646 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 14 FOLLOWER]: Advancing to term 15
W20251212 21:11:24.905838 29637 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58502: Illegal state: Cannot assign timestamp to op. Tablet is not in leader mode. Last heard from a leader: 0.002s ago.
W20251212 21:11:24.908217 29221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39996: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
I20251212 21:11:24.908941 29519 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "35867ec45b8041d48fc8c7bb132375c5" candidate_term: 15 candidate_status { last_received { term: 14 index: 17204 } } ignore_live_leader: true dest_uuid: "c941504e89314a6a868d59585d254b81"
I20251212 21:11:24.909034 29519 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 14 FOLLOWER]: Advancing to term 15
I20251212 21:11:24.909858 29519 raft_consensus.cc:2410] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 15 FOLLOWER]: Leader election vote request: Denying vote to candidate 35867ec45b8041d48fc8c7bb132375c5 for term 15 because replica has last-logged OpId of term: 14 index: 17205, which is greater than that of the candidate, which has last-logged OpId of term: 14 index: 17204.
I20251212 21:11:24.910365 29125 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "35867ec45b8041d48fc8c7bb132375c5" candidate_term: 15 candidate_status { last_received { term: 14 index: 17204 } } ignore_live_leader: true dest_uuid: "dd9e48fb810447718c09aca5a01b0fe3"
I20251212 21:11:24.910501 29125 raft_consensus.cc:2410] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 15 FOLLOWER]: Leader election vote request: Denying vote to candidate 35867ec45b8041d48fc8c7bb132375c5 for term 15 because replica has last-logged OpId of term: 14 index: 17206, which is greater than that of the candidate, which has last-logged OpId of term: 14 index: 17204.
I20251212 21:11:24.910734 29195 leader_election.cc:304] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [CANDIDATE]: Term 15 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 35867ec45b8041d48fc8c7bb132375c5; no voters: c941504e89314a6a868d59585d254b81, dd9e48fb810447718c09aca5a01b0fe3
I20251212 21:11:24.911190 29826 raft_consensus.cc:2749] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 15 FOLLOWER]: Leader election lost for term 15. Reason: could not achieve majority
W20251212 21:11:24.914938 29467 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42756: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:24.919286 29081 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58502: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
I20251212 21:11:24.927223 29241 tablet_service.cc:1460] Tablet server 35867ec45b8041d48fc8c7bb132375c5 set to quiescing
I20251212 21:11:24.927299 29241 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251212 21:11:24.928171 29221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39996: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:24.936708 29467 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42756: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:24.943689 29081 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58502: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:24.952391 29221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39996: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:24.962047 29467 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42756: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:24.976107 29080 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58502: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:24.988020 29221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39996: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.001672 29467 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42756: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.015769 29084 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58502: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.030361 29221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39996: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.048060 29467 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42756: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.067128 29083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58502: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
I20251212 21:11:25.071686 29440 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
W20251212 21:11:25.085709 29221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39996: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.107367 29467 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42756: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.127401 29083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58502: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
I20251212 21:11:25.134524 29307 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
W20251212 21:11:25.149942 29221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39996: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.172530 29467 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42756: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.194768 29083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58502: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.202983 29834 raft_consensus.cc:670] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: failed to trigger leader election: Illegal state: leader elections are disabled
W20251212 21:11:25.218539 29221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39996: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.244313 29467 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42756: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.270709 29083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58502: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.279114 29628 raft_consensus.cc:670] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: failed to trigger leader election: Illegal state: leader elections are disabled
W20251212 21:11:25.297645 29221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39996: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.312016 29826 raft_consensus.cc:670] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: failed to trigger leader election: Illegal state: leader elections are disabled
W20251212 21:11:25.325157 29467 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42756: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.355198 29083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58502: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.388078 29221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39996: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.418864 29467 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42756: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.452189 29083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58502: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.485118 29221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39996: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.518936 29467 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42756: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.556463 29083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58502: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.594662 29221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39996: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.630527 29467 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42756: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.669111 29083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58502: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.707123 29221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39996: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.750023 29467 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42756: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.793633 29083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58502: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.838495 29221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39996: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.880409 29467 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42756: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.925976 29083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58502: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
W20251212 21:11:25.970008 29221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39996: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
I20251212 21:11:26.015318 29105 tablet_service.cc:1460] Tablet server dd9e48fb810447718c09aca5a01b0fe3 set to quiescing
I20251212 21:11:26.015386 29105 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
W20251212 21:11:26.017845 29467 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42756: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:26.064582 29083 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:58502: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
I20251212 21:11:26.070868 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 29042
W20251212 21:11:26.080860 25716 meta_cache.cc:302] tablet edaf6af028f1466d9dccb7d78cf88122: replica dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130:36255) has failed: Network error: recv got EOF from 127.23.110.130:36255 (error 108)
I20251212 21:11:26.081264 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.130:36255
--local_ip_for_outbound_sockets=127.23.110.130
--tserver_master_addrs=127.23.110.190:45865
--webserver_port=40621
--webserver_interface=127.23.110.130
--builtin_ntp_servers=127.23.110.148:39679
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251212 21:11:26.083146 29221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39996: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:26.084105 29221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39996: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:26.090911 29467 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42756: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:26.105959 29221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39996: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:26.112232 29221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39996: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:26.113938 29467 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42756: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:26.131772 29221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39996: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:26.144583 29467 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42756: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:26.161742 29467 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42756: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:26.161757 29846 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:11:26.161933 29846 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:11:26.162011 29846 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:11:26.163517 29846 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:11:26.163571 29846 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.130
I20251212 21:11:26.165130 29846 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:39679
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.130:36255
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.23.110.130
--webserver_port=40621
--tserver_master_addrs=127.23.110.190:45865
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.29846
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.130
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:11:26.165376 29846 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:11:26.165584 29846 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:11:26.168501 29853 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:11:26.168697 29846 server_base.cc:1047] running on GCE node
W20251212 21:11:26.168689 29467 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42756: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:26.168486 29852 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:11:26.168802 29855 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:11:26.168978 29846 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:11:26.169170 29846 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:11:26.170323 29846 hybrid_clock.cc:648] HybridClock initialized: now 1765573886170324 us; error 41 us; skew 500 ppm
I20251212 21:11:26.171430 29846 webserver.cc:492] Webserver started at http://127.23.110.130:40621/ using document root <none> and password file <none>
I20251212 21:11:26.171622 29846 fs_manager.cc:362] Metadata directory not provided
I20251212 21:11:26.171663 29846 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:11:26.172856 29846 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:11:26.173529 29861 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:11:26.173688 29846 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:11:26.173768 29846 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
uuid: "dd9e48fb810447718c09aca5a01b0fe3"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:11:26.174036 29846 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
W20251212 21:11:26.180450 29221 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39996: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
I20251212 21:11:26.198416 29846 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:11:26.198668 29846 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:11:26.198764 29846 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:11:26.199064 29846 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:11:26.199486 29868 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251212 21:11:26.200356 29846 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251212 21:11:26.200404 29846 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:11:26.200430 29846 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251212 21:11:26.201032 29846 ts_tablet_manager.cc:616] Registered 1 tablets
I20251212 21:11:26.201063 29846 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:11:26.201118 29868 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Bootstrap starting.
I20251212 21:11:26.206588 29846 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.130:36255
I20251212 21:11:26.206640 29975 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.130:36255 every 8 connection(s)
I20251212 21:11:26.206923 29846 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-1/data/info.pb
I20251212 21:11:26.213586 29976 heartbeater.cc:344] Connected to a master server at 127.23.110.190:45865
I20251212 21:11:26.213697 29976 heartbeater.cc:461] Registering TS with master...
I20251212 21:11:26.213941 29976 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:26.214527 25116 ts_manager.cc:194] Re-registered known tserver with Master: dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130:36255)
I20251212 21:11:26.215077 25116 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.130:35339
I20251212 21:11:26.216001 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 29846
I20251212 21:11:26.216086 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 29177
I20251212 21:11:26.226078 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.129:42339
--local_ip_for_outbound_sockets=127.23.110.129
--tserver_master_addrs=127.23.110.190:45865
--webserver_port=34247
--webserver_interface=127.23.110.129
--builtin_ntp_servers=127.23.110.148:39679
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251212 21:11:26.250415 29467 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42756: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:26.269085 29467 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42756: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
I20251212 21:11:26.285854 29868 log.cc:826] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Log is configured to *not* fsync() on all Append() calls
W20251212 21:11:26.307737 29467 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42756: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:26.340137 29981 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:11:26.340369 29981 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:11:26.340404 29981 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:11:26.342715 29981 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:11:26.342803 29981 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.129
I20251212 21:11:26.345273 29981 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:39679
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.129:42339
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.23.110.129
--webserver_port=34247
--tserver_master_addrs=127.23.110.190:45865
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.29981
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.129
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:11:26.345563 29981 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:11:26.345849 29981 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:11:26.348825 29988 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:11:26.348901 29987 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:11:26.349009 29981 server_base.cc:1047] running on GCE node
W20251212 21:11:26.349120 29990 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:11:26.349397 29981 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:11:26.349679 29981 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:11:26.350838 29981 hybrid_clock.cc:648] HybridClock initialized: now 1765573886350814 us; error 39 us; skew 500 ppm
I20251212 21:11:26.352253 29981 webserver.cc:492] Webserver started at http://127.23.110.129:34247/ using document root <none> and password file <none>
I20251212 21:11:26.352567 29981 fs_manager.cc:362] Metadata directory not provided
I20251212 21:11:26.352631 29981 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:11:26.354007 29981 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:11:26.354799 29996 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:11:26.355010 29981 fs_manager.cc:730] Time spent opening block manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:11:26.355093 29981 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
uuid: "35867ec45b8041d48fc8c7bb132375c5"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:11:26.355363 29981 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251212 21:11:26.359859 25744 meta_cache.cc:1510] marking tablet server 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339) as failed
I20251212 21:11:26.365542 29981 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:11:26.365775 29981 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:11:26.365880 29981 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:11:26.366073 29981 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:11:26.366478 30003 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251212 21:11:26.367605 29981 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251212 21:11:26.367662 29981 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:11:26.367710 29981 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251212 21:11:26.368237 29981 ts_tablet_manager.cc:616] Registered 1 tablets
I20251212 21:11:26.368268 29981 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:11:26.368338 30003 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Bootstrap starting.
W20251212 21:11:26.374008 29467 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42756: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
I20251212 21:11:26.374856 29981 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.129:42339
I20251212 21:11:26.374958 30110 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.129:42339 every 8 connection(s)
I20251212 21:11:26.375219 29981 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-0/data/info.pb
W20251212 21:11:26.377067 29467 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42756: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
I20251212 21:11:26.381780 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 29981
I20251212 21:11:26.381875 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 29310
I20251212 21:11:26.387533 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.132:41841
--local_ip_for_outbound_sockets=127.23.110.132
--tserver_master_addrs=127.23.110.190:45865
--webserver_port=34523
--webserver_interface=127.23.110.132
--builtin_ntp_servers=127.23.110.148:39679
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
I20251212 21:11:26.389269 30111 heartbeater.cc:344] Connected to a master server at 127.23.110.190:45865
I20251212 21:11:26.389389 30111 heartbeater.cc:461] Registering TS with master...
I20251212 21:11:26.389623 30111 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:26.390280 25116 ts_manager.cc:194] Re-registered known tserver with Master: 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339)
I20251212 21:11:26.390743 25116 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.129:54709
W20251212 21:11:26.455365 29467 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:42756: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
I20251212 21:11:26.459121 30003 log.cc:826] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Log is configured to *not* fsync() on all Append() calls
W20251212 21:11:26.473703 30114 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:11:26.473884 30114 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:11:26.473915 30114 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:11:26.475430 30114 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:11:26.475498 30114 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.132
I20251212 21:11:26.477097 30114 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:39679
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.132:41841
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data/info.pb
--webserver_interface=127.23.110.132
--webserver_port=34523
--tserver_master_addrs=127.23.110.190:45865
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.30114
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.132
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:11:26.477368 30114 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:11:26.477633 30114 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:11:26.480198 30121 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:11:26.480198 30122 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:11:26.480399 30124 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:11:26.481053 30114 server_base.cc:1047] running on GCE node
I20251212 21:11:26.481221 30114 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:11:26.481442 30114 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:11:26.482605 30114 hybrid_clock.cc:648] HybridClock initialized: now 1765573886482527 us; error 92 us; skew 500 ppm
I20251212 21:11:26.483934 30114 webserver.cc:492] Webserver started at http://127.23.110.132:34523/ using document root <none> and password file <none>
I20251212 21:11:26.484129 30114 fs_manager.cc:362] Metadata directory not provided
I20251212 21:11:26.484174 30114 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:11:26.485459 30114 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:11:26.489410 30130 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:11:26.489601 30114 fs_manager.cc:730] Time spent opening block manager: real 0.004s	user 0.001s	sys 0.000s
I20251212 21:11:26.489686 30114 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
uuid: "3d78cc34680848ddacf9620033efe712"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:11:26.489993 30114 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251212 21:11:26.499706 30114 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:11:26.499956 30114 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:11:26.500062 30114 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:11:26.500272 30114 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:11:26.500597 30114 ts_tablet_manager.cc:585] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20251212 21:11:26.500633 30114 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:11:26.500662 30114 ts_tablet_manager.cc:616] Registered 0 tablets
I20251212 21:11:26.500715 30114 ts_tablet_manager.cc:595] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:11:26.507551 30114 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.132:41841
I20251212 21:11:26.507901 30114 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-3/data/info.pb
I20251212 21:11:26.508558 30243 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.132:41841 every 8 connection(s)
I20251212 21:11:26.513710 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 30114
I20251212 21:11:26.513856 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 29443
I20251212 21:11:26.521728 30244 heartbeater.cc:344] Connected to a master server at 127.23.110.190:45865
I20251212 21:11:26.521957 30244 heartbeater.cc:461] Registering TS with master...
I20251212 21:11:26.522245 30244 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:26.522651 25116 ts_manager.cc:194] Re-registered known tserver with Master: 3d78cc34680848ddacf9620033efe712 (127.23.110.132:41841)
I20251212 21:11:26.523151 25116 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.132:44233
I20251212 21:11:26.530160 23994 external_mini_cluster.cc:1366] Running /tmp/dist-test-taskAaNqbA/build/release/bin/kudu
/tmp/dist-test-taskAaNqbA/build/release/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.23.110.131:33221
--local_ip_for_outbound_sockets=127.23.110.131
--tserver_master_addrs=127.23.110.190:45865
--webserver_port=43311
--webserver_interface=127.23.110.131
--builtin_ntp_servers=127.23.110.148:39679
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--scanner_default_batch_size_bytes=100 with env {}
W20251212 21:11:26.648808 30247 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20251212 21:11:26.649103 30247 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20251212 21:11:26.649178 30247 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20251212 21:11:26.651522 30247 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20251212 21:11:26.651655 30247 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.23.110.131
I20251212 21:11:26.654203 30247 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.23.110.148:39679
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_heartbeat_interval_ms=100
--fs_data_dirs=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.23.110.131:33221
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.23.110.131
--webserver_port=43311
--tserver_master_addrs=127.23.110.190:45865
--scanner_default_batch_size_bytes=100
--never_fsync=true
--heap_profile_path=/tmp/kudu.30247
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.23.110.131
--log_dir=/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.19.0-SNAPSHOT
revision 468897755969d686fb08386f42f7835685e7953a
build type RELEASE
built by None at 12 Dec 2025 20:43:16 UTC on 5fd53c4cbb9d
build id 9483
I20251212 21:11:26.654453 30247 env_posix.cc:2267] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20251212 21:11:26.654781 30247 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20251212 21:11:26.657696 30255 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:11:26.657737 30252 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20251212 21:11:26.657932 30253 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20251212 21:11:26.657974 30247 server_base.cc:1047] running on GCE node
I20251212 21:11:26.658218 30247 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20251212 21:11:26.658468 30247 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20251212 21:11:26.659621 30247 hybrid_clock.cc:648] HybridClock initialized: now 1765573886659594 us; error 40 us; skew 500 ppm
I20251212 21:11:26.661108 30247 webserver.cc:492] Webserver started at http://127.23.110.131:43311/ using document root <none> and password file <none>
I20251212 21:11:26.661347 30247 fs_manager.cc:362] Metadata directory not provided
I20251212 21:11:26.661401 30247 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20251212 21:11:26.662948 30247 fs_manager.cc:714] Time spent opening directory manager: real 0.001s	user 0.000s	sys 0.001s
I20251212 21:11:26.664055 30261 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:11:26.664212 30247 fs_manager.cc:730] Time spent opening block manager: real 0.000s	user 0.000s	sys 0.000s
I20251212 21:11:26.664280 30247 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data,/tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
uuid: "c941504e89314a6a868d59585d254b81"
format_stamp: "Formatted at 2025-12-12 21:10:47 on dist-test-slave-rz82"
I20251212 21:11:26.664597 30247 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20251212 21:11:26.690908 30247 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20251212 21:11:26.691197 30247 env_posix.cc:2267] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20251212 21:11:26.691322 30247 kserver.cc:163] Server-wide thread pool size limit: 3276
I20251212 21:11:26.691578 30247 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20251212 21:11:26.692144 30268 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20251212 21:11:26.693290 30247 ts_tablet_manager.cc:585] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20251212 21:11:26.693342 30247 ts_tablet_manager.cc:531] Time spent load tablet metadata: real 0.001s	user 0.000s	sys 0.000s
I20251212 21:11:26.693413 30247 ts_tablet_manager.cc:600] Registering tablets (0/1 complete)
I20251212 21:11:26.694135 30247 ts_tablet_manager.cc:616] Registered 1 tablets
I20251212 21:11:26.694211 30247 ts_tablet_manager.cc:595] Time spent register tablets: real 0.001s	user 0.001s	sys 0.000s
I20251212 21:11:26.694288 30268 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Bootstrap starting.
I20251212 21:11:26.700819 30247 rpc_server.cc:307] RPC server started. Bound to: 127.23.110.131:33221
I20251212 21:11:26.701294 30247 server_base.cc:1179] Dumped server information to /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0/minicluster-data/ts-2/data/info.pb
I20251212 21:11:26.707993 30375 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.23.110.131:33221 every 8 connection(s)
I20251212 21:11:26.709996 23994 external_mini_cluster.cc:1428] Started /tmp/dist-test-taskAaNqbA/build/release/bin/kudu as pid 30247
I20251212 21:11:26.726263 30376 heartbeater.cc:344] Connected to a master server at 127.23.110.190:45865
I20251212 21:11:26.726372 30376 heartbeater.cc:461] Registering TS with master...
I20251212 21:11:26.726622 30376 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:26.727273 25116 ts_manager.cc:194] Re-registered known tserver with Master: c941504e89314a6a868d59585d254b81 (127.23.110.131:33221)
I20251212 21:11:26.727840 25116 master_service.cc:502] Signed X509 certificate for tserver {username='slave'} at 127.23.110.131:34411
I20251212 21:11:26.793874 30268 log.cc:826] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Log is configured to *not* fsync() on all Append() calls
I20251212 21:11:26.919150 30045 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:26.925515 29910 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:26.926541 30310 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:26.929037 30160 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:27.215818 29976 heartbeater.cc:499] Master 127.23.110.190:45865 was elected leader, sending a full tablet report...
I20251212 21:11:27.262187 29868 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Bootstrap replayed 1/4 log segments. Stats: ops{read=4625 overwritten=0 applied=4623 ignored=0} inserts{seen=230950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251212 21:11:27.391781 30111 heartbeater.cc:499] Master 127.23.110.190:45865 was elected leader, sending a full tablet report...
I20251212 21:11:27.524130 30244 heartbeater.cc:499] Master 127.23.110.190:45865 was elected leader, sending a full tablet report...
I20251212 21:11:27.670151 30003 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Bootstrap replayed 1/4 log segments. Stats: ops{read=4626 overwritten=0 applied=4623 ignored=0} inserts{seen=230950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251212 21:11:27.728829 30376 heartbeater.cc:499] Master 127.23.110.190:45865 was elected leader, sending a full tablet report...
I20251212 21:11:27.730250 30268 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Bootstrap replayed 1/4 log segments. Stats: ops{read=4625 overwritten=0 applied=4622 ignored=0} inserts{seen=230900 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251212 21:11:28.576581 29868 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Bootstrap replayed 2/4 log segments. Stats: ops{read=9247 overwritten=0 applied=9245 ignored=0} inserts{seen=462000 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251212 21:11:28.660562 30268 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Bootstrap replayed 2/4 log segments. Stats: ops{read=9248 overwritten=0 applied=9245 ignored=0} inserts{seen=462000 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251212 21:11:28.996029 30003 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Bootstrap replayed 2/4 log segments. Stats: ops{read=9247 overwritten=0 applied=9245 ignored=0} inserts{seen=462000 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251212 21:11:29.538110 30268 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Bootstrap replayed 3/4 log segments. Stats: ops{read=13875 overwritten=0 applied=13873 ignored=0} inserts{seen=693350 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251212 21:11:29.917977 29868 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Bootstrap replayed 3/4 log segments. Stats: ops{read=13868 overwritten=0 applied=13865 ignored=0} inserts{seen=692950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
I20251212 21:11:30.170888 30268 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Bootstrap replayed 4/4 log segments. Stats: ops{read=17205 overwritten=0 applied=17204 ignored=0} inserts{seen=859850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 1 replicates
I20251212 21:11:30.171370 30268 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Bootstrap complete.
I20251212 21:11:30.176748 30268 ts_tablet_manager.cc:1403] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Time spent bootstrapping tablet: real 3.483s	user 3.003s	sys 0.459s
I20251212 21:11:30.178062 30268 raft_consensus.cc:359] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 15 FOLLOWER]: Replica starting. Triggering 1 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:30.178718 30268 raft_consensus.cc:740] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 15 FOLLOWER]: Becoming Follower/Learner. State: Replica: c941504e89314a6a868d59585d254b81, State: Initialized, Role: FOLLOWER
I20251212 21:11:30.178926 30268 consensus_queue.cc:260] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 17204, Last appended: 14.17205, Last appended by leader: 17205, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:30.179140 30268 ts_tablet_manager.cc:1434] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81: Time spent starting tablet: real 0.002s	user 0.005s	sys 0.000s
I20251212 21:11:30.314234 30003 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Bootstrap replayed 3/4 log segments. Stats: ops{read=13868 overwritten=0 applied=13865 ignored=0} inserts{seen=692950 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 3 replicates
W20251212 21:11:30.393146 30289 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39288: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
I20251212 21:11:30.418663 30422 raft_consensus.cc:493] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 15 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251212 21:11:30.418819 30422 raft_consensus.cc:515] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 15 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:30.419160 30422 leader_election.cc:290] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [CANDIDATE]: Term 16 pre-election: Requested pre-vote from peers 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339), dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130:36255)
I20251212 21:11:30.423316 29930 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "c941504e89314a6a868d59585d254b81" candidate_term: 16 candidate_status { last_received { term: 14 index: 17205 } } ignore_live_leader: false dest_uuid: "dd9e48fb810447718c09aca5a01b0fe3" is_pre_election: true
I20251212 21:11:30.423300 30050 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "c941504e89314a6a868d59585d254b81" candidate_term: 16 candidate_status { last_received { term: 14 index: 17205 } } ignore_live_leader: false dest_uuid: "35867ec45b8041d48fc8c7bb132375c5" is_pre_election: true
W20251212 21:11:30.424481 30264 leader_election.cc:343] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [CANDIDATE]: Term 16 pre-election: Tablet error from VoteRequest() call to peer dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130:36255): Illegal state: must be running to vote when last-logged opid is not known
W20251212 21:11:30.424667 30263 leader_election.cc:343] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [CANDIDATE]: Term 16 pre-election: Tablet error from VoteRequest() call to peer 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339): Illegal state: must be running to vote when last-logged opid is not known
I20251212 21:11:30.424738 30263 leader_election.cc:304] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [CANDIDATE]: Term 16 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: c941504e89314a6a868d59585d254b81; no voters: 35867ec45b8041d48fc8c7bb132375c5, dd9e48fb810447718c09aca5a01b0fe3
I20251212 21:11:30.424852 30422 raft_consensus.cc:2749] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 15 FOLLOWER]: Leader pre-election lost for term 16. Reason: could not achieve majority
W20251212 21:11:30.473302 30289 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39288: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:30.679234 30289 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39288: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:30.684718 30289 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39288: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
I20251212 21:11:30.686300 29868 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Bootstrap replayed 4/4 log segments. Stats: ops{read=17206 overwritten=0 applied=17204 ignored=0} inserts{seen=859850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 2 replicates
I20251212 21:11:30.686746 29868 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Bootstrap complete.
I20251212 21:11:30.691951 29868 ts_tablet_manager.cc:1403] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Time spent bootstrapping tablet: real 4.491s	user 3.984s	sys 0.474s
I20251212 21:11:30.692840 29868 raft_consensus.cc:359] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 15 FOLLOWER]: Replica starting. Triggering 2 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:30.693480 29868 raft_consensus.cc:740] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 15 FOLLOWER]: Becoming Follower/Learner. State: Replica: dd9e48fb810447718c09aca5a01b0fe3, State: Initialized, Role: FOLLOWER
I20251212 21:11:30.693612 29868 consensus_queue.cc:260] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 17204, Last appended: 14.17206, Last appended by leader: 17206, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:30.693807 29868 ts_tablet_manager.cc:1434] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3: Time spent starting tablet: real 0.002s	user 0.004s	sys 0.000s
I20251212 21:11:30.764689 30422 raft_consensus.cc:493] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 15 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251212 21:11:30.764788 30422 raft_consensus.cc:515] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 15 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:30.764950 30422 leader_election.cc:290] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [CANDIDATE]: Term 16 pre-election: Requested pre-vote from peers 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339), dd9e48fb810447718c09aca5a01b0fe3 (127.23.110.130:36255)
I20251212 21:11:30.765198 30050 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "c941504e89314a6a868d59585d254b81" candidate_term: 16 candidate_status { last_received { term: 14 index: 17205 } } ignore_live_leader: false dest_uuid: "35867ec45b8041d48fc8c7bb132375c5" is_pre_election: true
I20251212 21:11:30.765197 29930 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "c941504e89314a6a868d59585d254b81" candidate_term: 16 candidate_status { last_received { term: 14 index: 17205 } } ignore_live_leader: false dest_uuid: "dd9e48fb810447718c09aca5a01b0fe3" is_pre_election: true
I20251212 21:11:30.765341 29930 raft_consensus.cc:2410] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 15 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate c941504e89314a6a868d59585d254b81 for term 16 because replica has last-logged OpId of term: 14 index: 17206, which is greater than that of the candidate, which has last-logged OpId of term: 14 index: 17205.
W20251212 21:11:30.765482 30263 leader_election.cc:343] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [CANDIDATE]: Term 16 pre-election: Tablet error from VoteRequest() call to peer 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339): Illegal state: must be running to vote when last-logged opid is not known
I20251212 21:11:30.765620 30264 leader_election.cc:304] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [CANDIDATE]: Term 16 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: c941504e89314a6a868d59585d254b81; no voters: 35867ec45b8041d48fc8c7bb132375c5, dd9e48fb810447718c09aca5a01b0fe3
I20251212 21:11:30.765724 30422 raft_consensus.cc:2749] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 15 FOLLOWER]: Leader pre-election lost for term 16. Reason: could not achieve majority
W20251212 21:11:30.901284 29889 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:41900: Illegal state: replica dd9e48fb810447718c09aca5a01b0fe3 is not leader of this config: current role FOLLOWER
I20251212 21:11:30.942994 30003 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Bootstrap replayed 4/4 log segments. Stats: ops{read=17204 overwritten=0 applied=17204 ignored=0} inserts{seen=859850 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20251212 21:11:30.943475 30003 tablet_bootstrap.cc:492] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Bootstrap complete.
I20251212 21:11:30.948551 30003 ts_tablet_manager.cc:1403] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Time spent bootstrapping tablet: real 4.580s	user 4.008s	sys 0.535s
I20251212 21:11:30.949401 30003 raft_consensus.cc:359] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 15 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:30.949606 30003 raft_consensus.cc:740] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 15 FOLLOWER]: Becoming Follower/Learner. State: Replica: 35867ec45b8041d48fc8c7bb132375c5, State: Initialized, Role: FOLLOWER
I20251212 21:11:30.949720 30003 consensus_queue.cc:260] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 17204, Last appended: 14.17204, Last appended by leader: 17204, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:30.949960 30003 ts_tablet_manager.cc:1434] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5: Time spent starting tablet: real 0.001s	user 0.005s	sys 0.000s
I20251212 21:11:31.006546 30429 raft_consensus.cc:493] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 15 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20251212 21:11:31.006729 30429 raft_consensus.cc:515] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 15 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:31.007057 30429 leader_election.cc:290] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 16 pre-election: Requested pre-vote from peers 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339), c941504e89314a6a868d59585d254b81 (127.23.110.131:33221)
W20251212 21:11:31.009765 30024 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46722: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
I20251212 21:11:31.010169 30330 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "dd9e48fb810447718c09aca5a01b0fe3" candidate_term: 16 candidate_status { last_received { term: 14 index: 17206 } } ignore_live_leader: false dest_uuid: "c941504e89314a6a868d59585d254b81" is_pre_election: true
I20251212 21:11:31.010296 30330 raft_consensus.cc:2468] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 15 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate dd9e48fb810447718c09aca5a01b0fe3 in term 15.
I20251212 21:11:31.010397 30050 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "dd9e48fb810447718c09aca5a01b0fe3" candidate_term: 16 candidate_status { last_received { term: 14 index: 17206 } } ignore_live_leader: false dest_uuid: "35867ec45b8041d48fc8c7bb132375c5" is_pre_election: true
I20251212 21:11:31.010515 30050 raft_consensus.cc:2468] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 15 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate dd9e48fb810447718c09aca5a01b0fe3 in term 15.
I20251212 21:11:31.010475 29865 leader_election.cc:304] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 16 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: c941504e89314a6a868d59585d254b81, dd9e48fb810447718c09aca5a01b0fe3; no voters: 
I20251212 21:11:31.010610 30429 raft_consensus.cc:2804] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 15 FOLLOWER]: Leader pre-election won for term 16
I20251212 21:11:31.010675 30429 raft_consensus.cc:493] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 15 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20251212 21:11:31.010700 30429 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 15 FOLLOWER]: Advancing to term 16
I20251212 21:11:31.011657 30429 raft_consensus.cc:515] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 16 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:31.011790 30429 leader_election.cc:290] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 16 election: Requested vote from peers 35867ec45b8041d48fc8c7bb132375c5 (127.23.110.129:42339), c941504e89314a6a868d59585d254b81 (127.23.110.131:33221)
I20251212 21:11:31.011932 30050 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "dd9e48fb810447718c09aca5a01b0fe3" candidate_term: 16 candidate_status { last_received { term: 14 index: 17206 } } ignore_live_leader: false dest_uuid: "35867ec45b8041d48fc8c7bb132375c5"
I20251212 21:11:31.011935 30330 tablet_service.cc:1911] Received RequestConsensusVote() RPC: tablet_id: "edaf6af028f1466d9dccb7d78cf88122" candidate_uuid: "dd9e48fb810447718c09aca5a01b0fe3" candidate_term: 16 candidate_status { last_received { term: 14 index: 17206 } } ignore_live_leader: false dest_uuid: "c941504e89314a6a868d59585d254b81"
I20251212 21:11:31.012005 30330 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 15 FOLLOWER]: Advancing to term 16
I20251212 21:11:31.012004 30050 raft_consensus.cc:3060] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 15 FOLLOWER]: Advancing to term 16
I20251212 21:11:31.013034 30050 raft_consensus.cc:2468] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 16 FOLLOWER]: Leader election vote request: Granting yes vote for candidate dd9e48fb810447718c09aca5a01b0fe3 in term 16.
I20251212 21:11:31.013034 30330 raft_consensus.cc:2468] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 16 FOLLOWER]: Leader election vote request: Granting yes vote for candidate dd9e48fb810447718c09aca5a01b0fe3 in term 16.
I20251212 21:11:31.013198 29863 leader_election.cc:304] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [CANDIDATE]: Term 16 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 35867ec45b8041d48fc8c7bb132375c5, dd9e48fb810447718c09aca5a01b0fe3; no voters: 
I20251212 21:11:31.013325 30429 raft_consensus.cc:2804] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 16 FOLLOWER]: Leader election won for term 16
I20251212 21:11:31.013464 30429 raft_consensus.cc:697] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [term 16 LEADER]: Becoming Leader. State: Replica: dd9e48fb810447718c09aca5a01b0fe3, State: Running, Role: LEADER
I20251212 21:11:31.013552 30429 consensus_queue.cc:237] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 17204, Committed index: 17204, Last appended: 14.17206, Last appended by leader: 17206, Current term: 16, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } }
I20251212 21:11:31.014205 25116 catalog_manager.cc:5654] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 reported cstate change: term changed from 14 to 16. New cstate: current_term: 16 leader_uuid: "dd9e48fb810447718c09aca5a01b0fe3" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "dd9e48fb810447718c09aca5a01b0fe3" member_type: VOTER last_known_addr { host: "127.23.110.130" port: 36255 } health_report { overall_health: HEALTHY } } }
W20251212 21:11:31.041770 25743 scanner-internal.cc:458] Time spent opening tablet: real 5.708s	user 0.001s	sys 0.001s
I20251212 21:11:31.105422 30050 raft_consensus.cc:1275] T edaf6af028f1466d9dccb7d78cf88122 P 35867ec45b8041d48fc8c7bb132375c5 [term 16 FOLLOWER]: Refusing update from remote peer dd9e48fb810447718c09aca5a01b0fe3: Log matching property violated. Preceding OpId in replica: term: 14 index: 17204. Preceding OpId from leader: term: 16 index: 17207. (index mismatch)
I20251212 21:11:31.105743 30429 consensus_queue.cc:1048] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [LEADER]: Connected to new peer: Peer: permanent_uuid: "35867ec45b8041d48fc8c7bb132375c5" member_type: VOTER last_known_addr { host: "127.23.110.129" port: 42339 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 17207, Last known committed idx: 17204, Time since last communication: 0.000s
I20251212 21:11:31.107582 30330 raft_consensus.cc:1275] T edaf6af028f1466d9dccb7d78cf88122 P c941504e89314a6a868d59585d254b81 [term 16 FOLLOWER]: Refusing update from remote peer dd9e48fb810447718c09aca5a01b0fe3: Log matching property violated. Preceding OpId in replica: term: 14 index: 17205. Preceding OpId from leader: term: 16 index: 17207. (index mismatch)
I20251212 21:11:31.107867 30429 consensus_queue.cc:1048] T edaf6af028f1466d9dccb7d78cf88122 P dd9e48fb810447718c09aca5a01b0fe3 [LEADER]: Connected to new peer: Peer: permanent_uuid: "c941504e89314a6a868d59585d254b81" member_type: VOTER last_known_addr { host: "127.23.110.131" port: 33221 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 17207, Last known committed idx: 17204, Time since last communication: 0.000s
W20251212 21:11:31.110808 30024 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46722: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:31.113003 30024 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:46722: Illegal state: replica 35867ec45b8041d48fc8c7bb132375c5 is not leader of this config: current role FOLLOWER
W20251212 21:11:31.114528 30289 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39288: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:31.114703 30290 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39288: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:31.118921 30290 tablet_service.cc:731] failed op from {username='slave'} at 127.0.0.1:39288: Illegal state: replica c941504e89314a6a868d59585d254b81 is not leader of this config: current role FOLLOWER
W20251212 21:11:31.163900 25744 scanner-internal.cc:458] Time spent opening tablet: real 6.007s	user 0.001s	sys 0.000s
W20251212 21:11:31.313553 25745 scanner-internal.cc:458] Time spent opening tablet: real 6.008s	user 0.001s	sys 0.001s
I20251212 21:11:32.165638 30160 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:32.166399 30045 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:32.178722 29910 tablet_service.cc:1467] Tablet server has 1 leaders and 3 scanners
I20251212 21:11:32.183488 30310 tablet_service.cc:1467] Tablet server has 0 leaders and 0 scanners
I20251212 21:11:32.538585 25115 ts_manager.cc:284] Unset tserver state for dd9e48fb810447718c09aca5a01b0fe3 from MAINTENANCE_MODE
I20251212 21:11:32.538914 30244 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:32.541628 25113 ts_manager.cc:284] Unset tserver state for c941504e89314a6a868d59585d254b81 from MAINTENANCE_MODE
I20251212 21:11:32.546681 25116 ts_manager.cc:284] Unset tserver state for 3d78cc34680848ddacf9620033efe712 from MAINTENANCE_MODE
I20251212 21:11:32.552753 25116 ts_manager.cc:284] Unset tserver state for 35867ec45b8041d48fc8c7bb132375c5 from MAINTENANCE_MODE
/home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/integration-tests/maintenance_mode-itest.cc:751: Failure
Value of: s.ok()
  Actual: true
Expected: false
/home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/util/test_util.cc:403: Failure
Failed
Timed out waiting for assertion to pass.
I20251212 21:11:33.110498 30111 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:33.113485 30376 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:33.114137 29976 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:33.542985 30244 heartbeater.cc:507] Master 127.23.110.190:45865 requested a full tablet report, sending...
I20251212 21:11:34.215127 23994 external_mini_cluster-itest-base.cc:80] Found fatal failure
I20251212 21:11:34.215240 23994 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 0 with UUID 35867ec45b8041d48fc8c7bb132375c5 and pid 29981
************************ BEGIN STACKS **************************
[New LWP 29983]
[New LWP 29984]
[New LWP 29985]
[New LWP 29986]
[New LWP 29992]
[New LWP 29993]
[New LWP 29994]
[New LWP 29997]
[New LWP 29998]
[New LWP 29999]
[New LWP 30000]
[New LWP 30001]
[New LWP 30002]
[New LWP 30004]
[New LWP 30005]
[New LWP 30006]
[New LWP 30007]
[New LWP 30008]
[New LWP 30009]
[New LWP 30010]
[New LWP 30011]
[New LWP 30012]
[New LWP 30013]
[New LWP 30014]
[New LWP 30015]
[New LWP 30016]
[New LWP 30017]
[New LWP 30018]
[New LWP 30019]
[New LWP 30020]
[New LWP 30021]
[New LWP 30022]
[New LWP 30023]
[New LWP 30024]
[New LWP 30025]
[New LWP 30026]
[New LWP 30027]
[New LWP 30028]
[New LWP 30029]
[New LWP 30030]
[New LWP 30031]
[New LWP 30032]
[New LWP 30033]
[New LWP 30034]
[New LWP 30035]
[New LWP 30036]
[New LWP 30037]
[New LWP 30038]
[New LWP 30039]
[New LWP 30040]
[New LWP 30041]
[New LWP 30042]
[New LWP 30043]
[New LWP 30044]
[New LWP 30045]
[New LWP 30046]
[New LWP 30047]
[New LWP 30048]
[New LWP 30049]
[New LWP 30050]
[New LWP 30051]
[New LWP 30052]
[New LWP 30053]
[New LWP 30054]
[New LWP 30055]
[New LWP 30056]
[New LWP 30057]
[New LWP 30058]
[New LWP 30059]
[New LWP 30060]
[New LWP 30061]
[New LWP 30062]
[New LWP 30063]
[New LWP 30064]
[New LWP 30065]
[New LWP 30066]
[New LWP 30067]
[New LWP 30068]
[New LWP 30069]
[New LWP 30070]
[New LWP 30071]
[New LWP 30072]
[New LWP 30073]
[New LWP 30074]
[New LWP 30075]
[New LWP 30076]
[New LWP 30077]
[New LWP 30078]
[New LWP 30079]
[New LWP 30080]
[New LWP 30081]
[New LWP 30082]
[New LWP 30083]
[New LWP 30084]
[New LWP 30085]
[New LWP 30086]
[New LWP 30087]
[New LWP 30088]
[New LWP 30089]
[New LWP 30090]
[New LWP 30091]
[New LWP 30092]
[New LWP 30093]
[New LWP 30094]
[New LWP 30095]
[New LWP 30096]
[New LWP 30097]
[New LWP 30098]
[New LWP 30099]
[New LWP 30100]
[New LWP 30101]
[New LWP 30102]
[New LWP 30103]
[New LWP 30104]
[New LWP 30105]
[New LWP 30106]
[New LWP 30107]
[New LWP 30108]
[New LWP 30109]
[New LWP 30110]
[New LWP 30111]
[New LWP 30112]
0x00007f2acb221d50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 29981 "kudu"  0x00007f2acb221d50 in ?? ()
  2    LWP 29983 "kudu"  0x00007f2acb21dfb9 in ?? ()
  3    LWP 29984 "kudu"  0x00007f2acb21dfb9 in ?? ()
  4    LWP 29985 "kudu"  0x00007f2acb21dfb9 in ?? ()
  5    LWP 29986 "kernel-watcher-" 0x00007f2acb21dfb9 in ?? ()
  6    LWP 29992 "ntp client-2999" 0x00007f2acb2219e2 in ?? ()
  7    LWP 29993 "file cache-evic" 0x00007f2acb21dfb9 in ?? ()
  8    LWP 29994 "sq_acceptor" 0x00007f2ac9332cb9 in ?? ()
  9    LWP 29997 "rpc reactor-299" 0x00007f2ac933fa47 in ?? ()
  10   LWP 29998 "rpc reactor-299" 0x00007f2ac933fa47 in ?? ()
  11   LWP 29999 "rpc reactor-299" 0x00007f2ac933fa47 in ?? ()
  12   LWP 30000 "rpc reactor-300" 0x00007f2ac933fa47 in ?? ()
  13   LWP 30001 "MaintenanceMgr " 0x00007f2acb21dad3 in ?? ()
  14   LWP 30002 "txn-status-mana" 0x00007f2acb21dfb9 in ?? ()
  15   LWP 30004 "collect_and_rem" 0x00007f2acb21dfb9 in ?? ()
  16   LWP 30005 "tc-session-exp-" 0x00007f2acb21dfb9 in ?? ()
  17   LWP 30006 "rpc worker-3000" 0x00007f2acb21dad3 in ?? ()
  18   LWP 30007 "rpc worker-3000" 0x00007f2acb21dad3 in ?? ()
  19   LWP 30008 "rpc worker-3000" 0x00007f2acb21dad3 in ?? ()
  20   LWP 30009 "rpc worker-3000" 0x00007f2acb21dad3 in ?? ()
  21   LWP 30010 "rpc worker-3001" 0x00007f2acb21dad3 in ?? ()
  22   LWP 30011 "rpc worker-3001" 0x00007f2acb21dad3 in ?? ()
  23   LWP 30012 "rpc worker-3001" 0x00007f2acb21dad3 in ?? ()
  24   LWP 30013 "rpc worker-3001" 0x00007f2acb21dad3 in ?? ()
  25   LWP 30014 "rpc worker-3001" 0x00007f2acb21dad3 in ?? ()
  26   LWP 30015 "rpc worker-3001" 0x00007f2acb21dad3 in ?? ()
  27   LWP 30016 "rpc worker-3001" 0x00007f2acb21dad3 in ?? ()
  28   LWP 30017 "rpc worker-3001" 0x00007f2acb21dad3 in ?? ()
  29   LWP 30018 "rpc worker-3001" 0x00007f2acb21dad3 in ?? ()
  30   LWP 30019 "rpc worker-3001" 0x00007f2acb21dad3 in ?? ()
  31   LWP 30020 "rpc worker-3002" 0x00007f2acb21dad3 in ?? ()
  32   LWP 30021 "rpc worker-3002" 0x00007f2acb21dad3 in ?? ()
  33   LWP 30022 "rpc worker-3002" 0x00007f2acb21dad3 in ?? ()
  34   LWP 30023 "rpc worker-3002" 0x00007f2acb21dad3 in ?? ()
  35   LWP 30024 "rpc worker-3002" 0x00007f2acb21dad3 in ?? ()
  36   LWP 30025 "rpc worker-3002" 0x00007f2acb21dad3 in ?? ()
  37   LWP 30026 "rpc worker-3002" 0x00007f2acb21dad3 in ?? ()
  38   LWP 30027 "rpc worker-3002" 0x00007f2acb21dad3 in ?? ()
  39   LWP 30028 "rpc worker-3002" 0x00007f2acb21dad3 in ?? ()
  40   LWP 30029 "rpc worker-3002" 0x00007f2acb21dad3 in ?? ()
  41   LWP 30030 "rpc worker-3003" 0x00007f2acb21dad3 in ?? ()
  42   LWP 30031 "rpc worker-3003" 0x00007f2acb21dad3 in ?? ()
  43   LWP 30032 "rpc worker-3003" 0x00007f2acb21dad3 in ?? ()
  44   LWP 30033 "rpc worker-3003" 0x00007f2acb21dad3 in ?? ()
  45   LWP 30034 "rpc worker-3003" 0x00007f2acb21dad3 in ?? ()
  46   LWP 30035 "rpc worker-3003" 0x00007f2acb21dad3 in ?? ()
  47   LWP 30036 "rpc worker-3003" 0x00007f2acb21dad3 in ?? ()
  48   LWP 30037 "rpc worker-3003" 0x00007f2acb21dad3 in ?? ()
  49   LWP 30038 "rpc worker-3003" 0x00007f2acb21dad3 in ?? ()
  50   LWP 30039 "rpc worker-3003" 0x00007f2acb21dad3 in ?? ()
  51   LWP 30040 "rpc worker-3004" 0x00007f2acb21dad3 in ?? ()
  52   LWP 30041 "rpc worker-3004" 0x00007f2acb21dad3 in ?? ()
  53   LWP 30042 "rpc worker-3004" 0x00007f2acb21dad3 in ?? ()
  54   LWP 30043 "rpc worker-3004" 0x00007f2acb21dad3 in ?? ()
  55   LWP 30044 "rpc worker-3004" 0x00007f2acb21dad3 in ?? ()
  56   LWP 30045 "rpc worker-3004" 0x00007f2acb21dad3 in ?? ()
  57   LWP 30046 "rpc worker-3004" 0x00007f2acb21dad3 in ?? ()
  58   LWP 30047 "rpc worker-3004" 0x00007f2acb21dad3 in ?? ()
  59   LWP 30048 "rpc worker-3004" 0x00007f2acb21dad3 in ?? ()
  60   LWP 30049 "rpc worker-3004" 0x00007f2acb21dad3 in ?? ()
  61   LWP 30050 "rpc worker-3005" 0x00007f2acb21dad3 in ?? ()
  62   LWP 30051 "rpc worker-3005" 0x00007f2acb21dad3 in ?? ()
  63   LWP 30052 "rpc worker-3005" 0x00007f2acb21dad3 in ?? ()
  64   LWP 30053 "rpc worker-3005" 0x00007f2acb21dad3 in ?? ()
  65   LWP 30054 "rpc worker-3005" 0x00007f2acb21dad3 in ?? ()
  66   LWP 30055 "rpc worker-3005" 0x00007f2acb21dad3 in ?? ()
  67   LWP 30056 "rpc worker-3005" 0x00007f2acb21dad3 in ?? ()
  68   LWP 30057 "rpc worker-3005" 0x00007f2acb21dad3 in ?? ()
  69   LWP 30058 "rpc worker-3005" 0x00007f2acb21dad3 in ?? ()
  70   LWP 30059 "rpc worker-3005" 0x00007f2acb21dad3 in ?? ()
  71   LWP 30060 "rpc worker-3006" 0x00007f2acb21dad3 in ?? ()
  72   LWP 30061 "rpc worker-3006" 0x00007f2acb21dad3 in ?? ()
  73   LWP 30062 "rpc worker-3006" 0x00007f2acb21dad3 in ?? ()
  74   LWP 30063 "rpc worker-3006" 0x00007f2acb21dad3 in ?? ()
  75   LWP 30064 "rpc worker-3006" 0x00007f2acb21dad3 in ?? ()
  76   LWP 30065 "rpc worker-3006" 0x00007f2acb21dad3 in ?? ()
  77   LWP 30066 "rpc worker-3006" 0x00007f2acb21dad3 in ?? ()
  78   LWP 30067 "rpc worker-3006" 0x00007f2acb21dad3 in ?? ()
  79   LWP 30068 "rpc worker-3006" 0x00007f2acb21dad3 in ?? ()
  80   LWP 30069 "rpc worker-3006" 0x00007f2acb21dad3 in ?? ()
  81   LWP 30070 "rpc worker-3007" 0x00007f2acb21dad3 in ?? ()
  82   LWP 30071 "rpc worker-3007" 0x00007f2acb21dad3 in ?? ()
  83   LWP 30072 "rpc worker-3007" 0x00007f2acb21dad3 in ?? ()
  84   LWP 30073 "rpc worker-3007" 0x00007f2acb21dad3 in ?? ()
  85   LWP 30074 "rpc worker-3007" 0x00007f2acb21dad3 in ?? ()
  86   LWP 30075 "rpc worker-3007" 0x00007f2acb21dad3 in ?? ()
  87   LWP 30076 "rpc worker-3007" 0x00007f2acb21dad3 in ?? ()
  88   LWP 30077 "rpc worker-3007" 0x00007f2acb21dad3 in ?? ()
  89   LWP 30078 "rpc worker-3007" 0x00007f2acb21dad3 in ?? ()
  90   LWP 30079 "rpc worker-3007" 0x00007f2acb21dad3 in ?? ()
  91   LWP 30080 "rpc worker-3008" 0x00007f2acb21dad3 in ?? ()
  92   LWP 30081 "rpc worker-3008" 0x00007f2acb21dad3 in ?? ()
  93   LWP 30082 "rpc worker-3008" 0x00007f2acb21dad3 in ?? ()
  94   LWP 30083 "rpc worker-3008" 0x00007f2acb21dad3 in ?? ()
  95   LWP 30084 "rpc worker-3008" 0x00007f2acb21dad3 in ?? ()
  96   LWP 30085 "rpc worker-3008" 0x00007f2acb21dad3 in ?? ()
  97   LWP 30086 "rpc worker-3008" 0x00007f2acb21dad3 in ?? ()
  98   LWP 30087 "rpc worker-3008" 0x00007f2acb21dad3 in ?? ()
  99   LWP 30088 "rpc worker-3008" 0x00007f2acb21dad3 in ?? ()
  100  LWP 30089 "rpc worker-3008" 0x00007f2acb21dad3 in ?? ()
  101  LWP 30090 "rpc worker-3009" 0x00007f2acb21dad3 in ?? ()
  102  LWP 30091 "rpc worker-3009" 0x00007f2acb21dad3 in ?? ()
  103  LWP 30092 "rpc worker-3009" 0x00007f2acb21dad3 in ?? ()
  104  LWP 30093 "rpc worker-3009" 0x00007f2acb21dad3 in ?? ()
  105  LWP 30094 "rpc worker-3009" 0x00007f2acb21dad3 in ?? ()
  106  LWP 30095 "rpc worker-3009" 0x00007f2acb21dad3 in ?? ()
  107  LWP 30096 "rpc worker-3009" 0x00007f2acb21dad3 in ?? ()
  108  LWP 30097 "rpc worker-3009" 0x00007f2acb21dad3 in ?? ()
  109  LWP 30098 "rpc worker-3009" 0x00007f2acb21dad3 in ?? ()
  110  LWP 30099 "rpc worker-3009" 0x00007f2acb21dad3 in ?? ()
  111  LWP 30100 "rpc worker-3010" 0x00007f2acb21dad3 in ?? ()
  112  LWP 30101 "rpc worker-3010" 0x00007f2acb21dad3 in ?? ()
  113  LWP 30102 "rpc worker-3010" 0x00007f2acb21dad3 in ?? ()
  114  LWP 30103 "rpc worker-3010" 0x00007f2acb21dad3 in ?? ()
  115  LWP 30104 "rpc worker-3010" 0x00007f2acb21dad3 in ?? ()
  116  LWP 30105 "rpc worker-3010" 0x00007f2acb21dad3 in ?? ()
  117  LWP 30106 "diag-logger-301" 0x00007f2acb21dfb9 in ?? ()
  118  LWP 30107 "result-tracker-" 0x00007f2acb21dfb9 in ?? ()
  119  LWP 30108 "excess-log-dele" 0x00007f2acb21dfb9 in ?? ()
  120  LWP 30109 "tcmalloc-memory" 0x00007f2acb21dfb9 in ?? ()
  121  LWP 30110 "acceptor-30110" 0x00007f2ac93410c7 in ?? ()
  122  LWP 30111 "heartbeat-30111" 0x00007f2acb21dfb9 in ?? ()
  123  LWP 30112 "maintenance_sch" 0x00007f2acb21dfb9 in ?? ()

Thread 123 (LWP 30112):
#0  0x00007f2acb21dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000021 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005640c47a7e50 in ?? ()
#5  0x00007f2a81ccd470 in ?? ()
#6  0x0000000000000042 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 30111):
#0  0x00007f2acb21dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000000a in ?? ()
#3  0x0000000100000081 in ?? ()
#4  0x00005640c46f7934 in ?? ()
#5  0x00007f2a824ce3f0 in ?? ()
#6  0x0000000000000015 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00007f2a824ce410 in ?? ()
#9  0x0000000200000000 in ?? ()
#10 0x0000008000000189 in ?? ()
#11 0x00007f2a824ce470 in ?? ()
#12 0x00007f2acae91711 in ?? ()
#13 0x0000000000000000 in ?? ()

Thread 121 (LWP 30110):
#0  0x00007f2ac93410c7 in ?? ()
#1  0x00007f2a82ccf020 in ?? ()
#2  0x00007f2acaea1ec2 in ?? ()
#3  0x00007f2a82ccf020 in ?? ()
#4  0x0000000000080800 in ?? ()
#5  0x00007f2a82ccf3e0 in ?? ()
#6  0x00007f2a82ccf090 in ?? ()
#7  0x00005640c46a24c8 in ?? ()
#8  0x00007f2acaea7959 in ?? ()
#9  0x00007f2a82ccf510 in ?? ()
#10 0x00007f2a82ccf700 in ?? ()
#11 0x0000008000000000 in ?? ()
#12 0x00007f2acb2213a7 in ?? ()
#13 0x00007f2a82cd0520 in ?? ()
#14 0x00007f2a82ccf260 in ?? ()
#15 0x00005640c4753140 in ?? ()
#16 0x0000000000000000 in ?? ()

Thread 120 (LWP 30109):
#0  0x00007f2acb21dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007fff3cdc9230 in ?? ()
#5  0x00007f2a834d0670 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 30108):
#0  0x00007f2acb21dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 118 (LWP 30107):
#0  0x00007f2acb21dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000008 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005640c462bb70 in ?? ()
#5  0x00007f2a844d2680 in ?? ()
#6  0x0000000000000010 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 30106):
#0  0x00007f2acb21dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000008 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005640c49a9390 in ?? ()
#5  0x00007f2a84cd3550 in ?? ()
#6  0x0000000000000010 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 116 (LWP 30105):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 115 (LWP 30104):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 114 (LWP 30103):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 30102):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 30101):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00005640c49ad83c in ?? ()
#4  0x00007f2a874d85c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f2a874d85e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00005640c49ad828 in ?? ()
#9  0x00007f2acb21d770 in ?? ()
#10 0x00007f2a874d85e0 in ?? ()
#11 0x00007f2a874d8640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 111 (LWP 30100):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 30099):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 30098):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 30097):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 30096):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 30095):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 30094):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 30093):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 30092):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 30091):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 30090):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 30089):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 30088):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 30087):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 97 (LWP 30086):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000007 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00005640c49ad8bc in ?? ()
#4  0x00007f2a8ece75c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f2a8ece75e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00005640c49ad8a8 in ?? ()
#9  0x00007f2acb21d770 in ?? ()
#10 0x00007f2a8ece75e0 in ?? ()
#11 0x00007f2a8ece7640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 96 (LWP 30085):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 30084):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 30083):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 30082):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 30081):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 30080):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 30079):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 30078):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 30077):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 30076):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 30075):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 30074):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 30073):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 30072):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 30071):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 30070):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 30069):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 30068):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 30067):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 30066):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 76 (LWP 30065):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 75 (LWP 30064):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 74 (LWP 30063):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 30062):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 30061):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 30060):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 30059):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 30058):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 30057):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 30056):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 30055):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 30054):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 30053):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 30052):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 30051):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 30050):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000247 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00005640c4973c3c in ?? ()
#4  0x00007f2aa0d0b5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f2aa0d0b5e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00005640c4973c28 in ?? ()
#9  0x00007f2acb21d770 in ?? ()
#10 0x00007f2aa0d0b5e0 in ?? ()
#11 0x00007f2aa0d0b640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 60 (LWP 30049):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x00000000000002cf in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00005640c4973bbc in ?? ()
#4  0x00007f2aa150c5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f2aa150c5e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00005640c4973ba8 in ?? ()
#9  0x00007f2acb21d770 in ?? ()
#10 0x00007f2aa150c5e0 in ?? ()
#11 0x00007f2aa150c640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 59 (LWP 30048):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 30047):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 30046):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 56 (LWP 30045):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00005640c4972638 in ?? ()
#4  0x00007f2aa35105c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f2aa35105e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 55 (LWP 30044):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 30043):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 30042):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 30041):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 30040):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 30039):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 30038):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 30037):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 30036):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 30035):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 30034):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 30033):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 30032):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 30031):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 30030):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 30029):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 30028):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 30027):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 30026):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 36 (LWP 30025):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00005640c4897b38 in ?? ()
#4  0x00007f2aad5245c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f2aad5245e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 35 (LWP 30024):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000050 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00005640c4897ab8 in ?? ()
#4  0x00007f2aadd255c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f2aadd255e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 34 (LWP 30023):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 33 (LWP 30022):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 32 (LWP 30021):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 31 (LWP 30020):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 30 (LWP 30019):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 30018):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 30017):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 30016):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 30015):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 30014):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 30013):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 30012):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 30011):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 30010):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 30009):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 19 (LWP 30008):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 18 (LWP 30007):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 17 (LWP 30006):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 16 (LWP 30005):
#0  0x00007f2acb21dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 15 (LWP 30004):
#0  0x00007f2acb21dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005640c46116c8 in ?? ()
#5  0x00007f2ab7d396a0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 30002):
#0  0x00007f2acb21dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 13 (LWP 30001):
#0  0x00007f2acb21dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 30000):
#0  0x00007f2ac933fa47 in ?? ()
#1  0x00007f2ab9d3d680 in ?? ()
#2  0x00007f2ac4643571 in ?? ()
#3  0x00000000000002de in ?? ()
#4  0x00005640c4709398 in ?? ()
#5  0x00007f2ab9d3d6c0 in ?? ()
#6  0x00007f2ab9d3d840 in ?? ()
#7  0x00005640c47ae670 in ?? ()
#8  0x00007f2ac464525d in ?? ()
#9  0x3fb96691efcec000 in ?? ()
#10 0x00005640c46fac00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x00005640c46fac00 in ?? ()
#13 0x00000000c4709398 in ?? ()
#14 0x0000564000000000 in ?? ()
#15 0x41da4f1fed70f9a0 in ?? ()
#16 0x00005640c47ae670 in ?? ()
#17 0x00007f2ab9d3d720 in ?? ()
#18 0x00007f2ac4649ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb96691efcec000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 11 (LWP 29999):
#0  0x00007f2ac933fa47 in ?? ()
#1  0x00007f2aba53e680 in ?? ()
#2  0x00007f2ac4643571 in ?? ()
#3  0x00000000000002de in ?? ()
#4  0x00005640c4709018 in ?? ()
#5  0x00007f2aba53e6c0 in ?? ()
#6  0x00007f2aba53e840 in ?? ()
#7  0x00005640c47ae670 in ?? ()
#8  0x00007f2ac464525d in ?? ()
#9  0x3fb97033da19c000 in ?? ()
#10 0x00005640c46f9b80 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x00005640c46f9b80 in ?? ()
#13 0x00000000c4709018 in ?? ()
#14 0x0000564000000000 in ?? ()
#15 0x41da4f1fed70f99c in ?? ()
#16 0x00005640c47ae670 in ?? ()
#17 0x00007f2aba53e720 in ?? ()
#18 0x00007f2ac4649ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb97033da19c000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 10 (LWP 29998):
#0  0x00007f2ac933fa47 in ?? ()
#1  0x00007f2abad3f680 in ?? ()
#2  0x00007f2ac4643571 in ?? ()
#3  0x00000000000002de in ?? ()
#4  0x00005640c4709558 in ?? ()
#5  0x00007f2abad3f6c0 in ?? ()
#6  0x00007f2abad3f840 in ?? ()
#7  0x00005640c47ae670 in ?? ()
#8  0x00007f2ac464525d in ?? ()
#9  0x3fb96404d5520000 in ?? ()
#10 0x00005640c46fa100 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x00005640c46fa100 in ?? ()
#13 0x00000000c4709558 in ?? ()
#14 0x0000564000000000 in ?? ()
#15 0x41da4f1fed70f99f in ?? ()
#16 0x00005640c47ae670 in ?? ()
#17 0x00007f2abad3f720 in ?? ()
#18 0x00007f2ac4649ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb96404d5520000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 9 (LWP 29997):
#0  0x00007f2ac933fa47 in ?? ()
#1  0x00007f2abc921680 in ?? ()
#2  0x00007f2ac4643571 in ?? ()
#3  0x00000000000002de in ?? ()
#4  0x00005640c4708e58 in ?? ()
#5  0x00007f2abc9216c0 in ?? ()
#6  0x00007f2abc921840 in ?? ()
#7  0x00005640c47ae670 in ?? ()
#8  0x00007f2ac464525d in ?? ()
#9  0x3fb9894f81618000 in ?? ()
#10 0x00005640c46f9600 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x00005640c46f9600 in ?? ()
#13 0x00000000c4708e58 in ?? ()
#14 0x0000564000000000 in ?? ()
#15 0x41da4f1fed70f99f in ?? ()
#16 0x00005640c47ae670 in ?? ()
#17 0x00007f2abc921720 in ?? ()
#18 0x00007f2ac4649ba3 in ?? ()
#19 0x0000000000000000 in ?? ()

Thread 8 (LWP 29994):
#0  0x00007f2ac9332cb9 in ?? ()
#1  0x00007f2abe124840 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 7 (LWP 29993):
#0  0x00007f2acb21dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 6 (LWP 29992):
#0  0x00007f2acb2219e2 in ?? ()
#1  0x00005640c462bee0 in ?? ()
#2  0x00007f2abd1224d0 in ?? ()
#3  0x00007f2abd122450 in ?? ()
#4  0x00007f2abd122570 in ?? ()
#5  0x00007f2abd122790 in ?? ()
#6  0x00007f2abd1227a0 in ?? ()
#7  0x00007f2abd1224e0 in ?? ()
#8  0x00007f2abd1224d0 in ?? ()
#9  0x00005640c462a350 in ?? ()
#10 0x00007f2acb60cc6f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 5 (LWP 29986):
#0  0x00007f2acb21dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000029 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005640c47b0dc8 in ?? ()
#5  0x00007f2abf126430 in ?? ()
#6  0x0000000000000052 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 4 (LWP 29985):
#0  0x00007f2acb21dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005640c4610848 in ?? ()
#5  0x00007f2abf927790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 3 (LWP 29984):
#0  0x00007f2acb21dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005640c46102a8 in ?? ()
#5  0x00007f2ac0128790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 2 (LWP 29983):
#0  0x00007f2acb21dfb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00005640c4610188 in ?? ()
#5  0x00007f2ac0929790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 1 (LWP 29981):
#0  0x00007f2acb221d50 in ?? ()
#1  0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20251212 21:11:34.734045 23994 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 1 with UUID dd9e48fb810447718c09aca5a01b0fe3 and pid 29846
************************ BEGIN STACKS **************************
[New LWP 29848]
[New LWP 29849]
[New LWP 29850]
[New LWP 29851]
[New LWP 29857]
[New LWP 29858]
[New LWP 29859]
[New LWP 29862]
[New LWP 29863]
[New LWP 29864]
[New LWP 29865]
[New LWP 29866]
[New LWP 29867]
[New LWP 29869]
[New LWP 29870]
[New LWP 29871]
[New LWP 29872]
[New LWP 29873]
[New LWP 29874]
[New LWP 29875]
[New LWP 29876]
[New LWP 29877]
[New LWP 29878]
[New LWP 29879]
[New LWP 29880]
[New LWP 29881]
[New LWP 29882]
[New LWP 29883]
[New LWP 29884]
[New LWP 29885]
[New LWP 29886]
[New LWP 29887]
[New LWP 29888]
[New LWP 29889]
[New LWP 29890]
[New LWP 29891]
[New LWP 29892]
[New LWP 29893]
[New LWP 29894]
[New LWP 29895]
[New LWP 29896]
[New LWP 29897]
[New LWP 29898]
[New LWP 29899]
[New LWP 29900]
[New LWP 29901]
[New LWP 29902]
[New LWP 29903]
[New LWP 29904]
[New LWP 29905]
[New LWP 29906]
[New LWP 29907]
[New LWP 29908]
[New LWP 29909]
[New LWP 29910]
[New LWP 29911]
[New LWP 29912]
[New LWP 29913]
[New LWP 29914]
[New LWP 29915]
[New LWP 29916]
[New LWP 29917]
[New LWP 29918]
[New LWP 29919]
[New LWP 29920]
[New LWP 29921]
[New LWP 29922]
[New LWP 29923]
[New LWP 29924]
[New LWP 29925]
[New LWP 29926]
[New LWP 29927]
[New LWP 29928]
[New LWP 29929]
[New LWP 29930]
[New LWP 29931]
[New LWP 29932]
[New LWP 29933]
[New LWP 29934]
[New LWP 29935]
[New LWP 29936]
[New LWP 29937]
[New LWP 29938]
[New LWP 29939]
[New LWP 29940]
[New LWP 29941]
[New LWP 29942]
[New LWP 29943]
[New LWP 29944]
[New LWP 29945]
[New LWP 29946]
[New LWP 29947]
[New LWP 29948]
[New LWP 29949]
[New LWP 29950]
[New LWP 29951]
[New LWP 29952]
[New LWP 29953]
[New LWP 29954]
[New LWP 29955]
[New LWP 29956]
[New LWP 29957]
[New LWP 29958]
[New LWP 29959]
[New LWP 29960]
[New LWP 29961]
[New LWP 29962]
[New LWP 29963]
[New LWP 29964]
[New LWP 29965]
[New LWP 29966]
[New LWP 29967]
[New LWP 29968]
[New LWP 29969]
[New LWP 29970]
[New LWP 29971]
[New LWP 29972]
[New LWP 29973]
[New LWP 29974]
[New LWP 29975]
[New LWP 29976]
[New LWP 29977]
[New LWP 30455]
0x00007fdf36f12d50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 29846 "kudu"  0x00007fdf36f12d50 in ?? ()
  2    LWP 29848 "kudu"  0x00007fdf36f0efb9 in ?? ()
  3    LWP 29849 "kudu"  0x00007fdf36f0efb9 in ?? ()
  4    LWP 29850 "kudu"  0x00007fdf36f0efb9 in ?? ()
  5    LWP 29851 "kernel-watcher-" 0x00007fdf36f0efb9 in ?? ()
  6    LWP 29857 "ntp client-2985" 0x00007fdf36f129e2 in ?? ()
  7    LWP 29858 "file cache-evic" 0x00007fdf36f0efb9 in ?? ()
  8    LWP 29859 "sq_acceptor" 0x00007fdf35023cb9 in ?? ()
  9    LWP 29862 "rpc reactor-298" 0x00007fdf35030a47 in ?? ()
  10   LWP 29863 "rpc reactor-298" 0x00007fdf35030a47 in ?? ()
  11   LWP 29864 "rpc reactor-298" 0x00007fdf35030a47 in ?? ()
  12   LWP 29865 "rpc reactor-298" 0x00007fdf35030a47 in ?? ()
  13   LWP 29866 "MaintenanceMgr " 0x00007fdf36f0ead3 in ?? ()
  14   LWP 29867 "txn-status-mana" 0x00007fdf36f0efb9 in ?? ()
  15   LWP 29869 "collect_and_rem" 0x00007fdf36f0efb9 in ?? ()
  16   LWP 29870 "tc-session-exp-" 0x00007fdf36f0efb9 in ?? ()
  17   LWP 29871 "rpc worker-2987" 0x00007fdf36f0ead3 in ?? ()
  18   LWP 29872 "rpc worker-2987" 0x00007fdf36f0ead3 in ?? ()
  19   LWP 29873 "rpc worker-2987" 0x00007fdf36f0ead3 in ?? ()
  20   LWP 29874 "rpc worker-2987" 0x00007fdf36f0ead3 in ?? ()
  21   LWP 29875 "rpc worker-2987" 0x00007fdf36f0ead3 in ?? ()
  22   LWP 29876 "rpc worker-2987" 0x00007fdf36f0ead3 in ?? ()
  23   LWP 29877 "rpc worker-2987" 0x00007fdf36f0ead3 in ?? ()
  24   LWP 29878 "rpc worker-2987" 0x00007fdf36f0ead3 in ?? ()
  25   LWP 29879 "rpc worker-2987" 0x00007fdf36f0ead3 in ?? ()
  26   LWP 29880 "rpc worker-2988" 0x00007fdf36f0ead3 in ?? ()
  27   LWP 29881 "rpc worker-2988" 0x00007fdf36f0ead3 in ?? ()
  28   LWP 29882 "rpc worker-2988" 0x00007fdf36f0ead3 in ?? ()
  29   LWP 29883 "rpc worker-2988" 0x00007fdf36f0ead3 in ?? ()
  30   LWP 29884 "rpc worker-2988" 0x00007fdf36f0ead3 in ?? ()
  31   LWP 29885 "rpc worker-2988" 0x00007fdf36f0ead3 in ?? ()
  32   LWP 29886 "rpc worker-2988" 0x00007fdf36f0ead3 in ?? ()
  33   LWP 29887 "rpc worker-2988" 0x00007fdf36f0ead3 in ?? ()
  34   LWP 29888 "rpc worker-2988" 0x00007fdf36f0ead3 in ?? ()
  35   LWP 29889 "rpc worker-2988" 0x00007fdf36f0ead3 in ?? ()
  36   LWP 29890 "rpc worker-2989" 0x00007fdf36f0ead3 in ?? ()
  37   LWP 29891 "rpc worker-2989" 0x00007fdf36f0ead3 in ?? ()
  38   LWP 29892 "rpc worker-2989" 0x00007fdf36f0ead3 in ?? ()
  39   LWP 29893 "rpc worker-2989" 0x00007fdf36f0ead3 in ?? ()
  40   LWP 29894 "rpc worker-2989" 0x00007fdf36f0ead3 in ?? ()
  41   LWP 29895 "rpc worker-2989" 0x00007fdf36f0ead3 in ?? ()
  42   LWP 29896 "rpc worker-2989" 0x00007fdf36f0ead3 in ?? ()
  43   LWP 29897 "rpc worker-2989" 0x00007fdf36f0ead3 in ?? ()
  44   LWP 29898 "rpc worker-2989" 0x00007fdf36f0ead3 in ?? ()
  45   LWP 29899 "rpc worker-2989" 0x00007fdf36f0ead3 in ?? ()
  46   LWP 29900 "rpc worker-2990" 0x00007fdf36f0ead3 in ?? ()
  47   LWP 29901 "rpc worker-2990" 0x00007fdf36f0ead3 in ?? ()
  48   LWP 29902 "rpc worker-2990" 0x00007fdf36f0ead3 in ?? ()
  49   LWP 29903 "rpc worker-2990" 0x00007fdf36f0ead3 in ?? ()
  50   LWP 29904 "rpc worker-2990" 0x00007fdf36f0ead3 in ?? ()
  51   LWP 29905 "rpc worker-2990" 0x00007fdf36f0ead3 in ?? ()
  52   LWP 29906 "rpc worker-2990" 0x00007fdf36f0ead3 in ?? ()
  53   LWP 29907 "rpc worker-2990" 0x00007fdf36f0ead3 in ?? ()
  54   LWP 29908 "rpc worker-2990" 0x00007fdf36f0ead3 in ?? ()
  55   LWP 29909 "rpc worker-2990" 0x00007fdf36f0ead3 in ?? ()
  56   LWP 29910 "rpc worker-2991" 0x00007fdf36f0ead3 in ?? ()
  57   LWP 29911 "rpc worker-2991" 0x00007fdf36f0ead3 in ?? ()
  58   LWP 29912 "rpc worker-2991" 0x00007fdf36f0ead3 in ?? ()
  59   LWP 29913 "rpc worker-2991" 0x00007fdf36f0ead3 in ?? ()
  60   LWP 29914 "rpc worker-2991" 0x00007fdf36f0ead3 in ?? ()
  61   LWP 29915 "rpc worker-2991" 0x00007fdf36f0ead3 in ?? ()
  62   LWP 29916 "rpc worker-2991" 0x00007fdf36f0ead3 in ?? ()
  63   LWP 29917 "rpc worker-2991" 0x00007fdf36f0ead3 in ?? ()
  64   LWP 29918 "rpc worker-2991" 0x00007fdf36f0ead3 in ?? ()
  65   LWP 29919 "rpc worker-2991" 0x00007fdf36f0ead3 in ?? ()
  66   LWP 29920 "rpc worker-2992" 0x00007fdf36f0ead3 in ?? ()
  67   LWP 29921 "rpc worker-2992" 0x00007fdf36f0ead3 in ?? ()
  68   LWP 29922 "rpc worker-2992" 0x00007fdf36f0ead3 in ?? ()
  69   LWP 29923 "rpc worker-2992" 0x00007fdf36f0ead3 in ?? ()
  70   LWP 29924 "rpc worker-2992" 0x00007fdf36f0ead3 in ?? ()
  71   LWP 29925 "rpc worker-2992" 0x00007fdf36f0ead3 in ?? ()
  72   LWP 29926 "rpc worker-2992" 0x00007fdf36f0ead3 in ?? ()
  73   LWP 29927 "rpc worker-2992" 0x00007fdf36f0ead3 in ?? ()
  74   LWP 29928 "rpc worker-2992" 0x00007fdf36f0ead3 in ?? ()
  75   LWP 29929 "rpc worker-2992" 0x00007fdf36f0ead3 in ?? ()
  76   LWP 29930 "rpc worker-2993" 0x00007fdf36f0ead3 in ?? ()
  77   LWP 29931 "rpc worker-2993" 0x00007fdf36f0ead3 in ?? ()
  78   LWP 29932 "rpc worker-2993" 0x00007fdf36f0ead3 in ?? ()
  79   LWP 29933 "rpc worker-2993" 0x00007fdf36f0ead3 in ?? ()
  80   LWP 29934 "rpc worker-2993" 0x00007fdf36f0ead3 in ?? ()
  81   LWP 29935 "rpc worker-2993" 0x00007fdf36f0ead3 in ?? ()
  82   LWP 29936 "rpc worker-2993" 0x00007fdf36f0ead3 in ?? ()
  83   LWP 29937 "rpc worker-2993" 0x00007fdf36f0ead3 in ?? ()
  84   LWP 29938 "rpc worker-2993" 0x00007fdf36f0ead3 in ?? ()
  85   LWP 29939 "rpc worker-2993" 0x00007fdf36f0ead3 in ?? ()
  86   LWP 29940 "rpc worker-2994" 0x00007fdf36f0ead3 in ?? ()
  87   LWP 29941 "rpc worker-2994" 0x00007fdf36f0ead3 in ?? ()
  88   LWP 29942 "rpc worker-2994" 0x00007fdf36f0ead3 in ?? ()
  89   LWP 29943 "rpc worker-2994" 0x00007fdf36f0ead3 in ?? ()
  90   LWP 29944 "rpc worker-2994" 0x00007fdf36f0ead3 in ?? ()
  91   LWP 29945 "rpc worker-2994" 0x00007fdf36f0ead3 in ?? ()
  92   LWP 29946 "rpc worker-2994" 0x00007fdf36f0ead3 in ?? ()
  93   LWP 29947 "rpc worker-2994" 0x00007fdf36f0ead3 in ?? ()
  94   LWP 29948 "rpc worker-2994" 0x00007fdf36f0ead3 in ?? ()
  95   LWP 29949 "rpc worker-2994" 0x00007fdf36f0ead3 in ?? ()
  96   LWP 29950 "rpc worker-2995" 0x00007fdf36f0ead3 in ?? ()
  97   LWP 29951 "rpc worker-2995" 0x00007fdf36f0ead3 in ?? ()
  98   LWP 29952 "rpc worker-2995" 0x00007fdf36f0ead3 in ?? ()
  99   LWP 29953 "rpc worker-2995" 0x00007fdf36f0ead3 in ?? ()
  100  LWP 29954 "rpc worker-2995" 0x00007fdf36f0ead3 in ?? ()
  101  LWP 29955 "rpc worker-2995" 0x00007fdf36f0ead3 in ?? ()
  102  LWP 29956 "rpc worker-2995" 0x00007fdf36f0ead3 in ?? ()
  103  LWP 29957 "rpc worker-2995" 0x00007fdf36f0ead3 in ?? ()
  104  LWP 29958 "rpc worker-2995" 0x00007fdf36f0ead3 in ?? ()
  105  LWP 29959 "rpc worker-2995" 0x00007fdf36f0ead3 in ?? ()
  106  LWP 29960 "rpc worker-2996" 0x00007fdf36f0ead3 in ?? ()
  107  LWP 29961 "rpc worker-2996" 0x00007fdf36f0ead3 in ?? ()
  108  LWP 29962 "rpc worker-2996" 0x00007fdf36f0ead3 in ?? ()
  109  LWP 29963 "rpc worker-2996" 0x00007fdf36f0ead3 in ?? ()
  110  LWP 29964 "rpc worker-2996" 0x00007fdf36f0ead3 in ?? ()
  111  LWP 29965 "rpc worker-2996" 0x00007fdf36f0ead3 in ?? ()
  112  LWP 29966 "rpc worker-2996" 0x00007fdf36f0ead3 in ?? ()
  113  LWP 29967 "rpc worker-2996" 0x00007fdf36f0ead3 in ?? ()
  114  LWP 29968 "rpc worker-2996" 0x00007fdf36f0ead3 in ?? ()
  115  LWP 29969 "rpc worker-2996" 0x00007fdf36f0ead3 in ?? ()
  116  LWP 29970 "rpc worker-2997" 0x00007fdf36f0ead3 in ?? ()
  117  LWP 29971 "diag-logger-299" 0x00007fdf36f0efb9 in ?? ()
  118  LWP 29972 "result-tracker-" 0x00007fdf36f0efb9 in ?? ()
  119  LWP 29973 "excess-log-dele" 0x00007fdf36f0efb9 in ?? ()
  120  LWP 29974 "tcmalloc-memory" 0x00007fdf36f0efb9 in ?? ()
  121  LWP 29975 "acceptor-29975" 0x00007fdf350320c7 in ?? ()
  122  LWP 29976 "heartbeat-29976" 0x00007fdf36f0efb9 in ?? ()
  123  LWP 29977 "maintenance_sch" 0x00007fdf36f0efb9 in ?? ()
  124  LWP 30455 "raft [worker]-3" 0x00007fdf36f0efb9 in ?? ()

Thread 124 (LWP 30455):
#0  0x00007fdf36f0efb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x00000000000004e3 in ?? ()
#3  0x0000000100000081 in ?? ()
#4  0x00007fdee89b4764 in ?? ()
#5  0x00007fdee89b4510 in ?? ()
#6  0x00000000000009c7 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00007fdee89b4530 in ?? ()
#9  0x0000000200000000 in ?? ()
#10 0x0000008000000189 in ?? ()
#11 0x00007fdee89b4590 in ?? ()
#12 0x00007fdf36b82711 in ?? ()
#13 0x0000000000000000 in ?? ()

Thread 123 (LWP 29977):
#0  0x00007fdf36f0efb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000023 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x0000561c56d45e50 in ?? ()
#5  0x00007fdeed9be470 in ?? ()
#6  0x0000000000000046 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 29976):
#0  0x00007fdf36f0efb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000000c in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x0000561c56c95930 in ?? ()
#5  0x00007fdeee1bf3f0 in ?? ()
#6  0x0000000000000018 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 121 (LWP 29975):
#0  0x00007fdf350320c7 in ?? ()
#1  0x00007fdeee9c0020 in ?? ()
#2  0x00007fdf36b92ec2 in ?? ()
#3  0x00007fdeee9c0020 in ?? ()
#4  0x0000000000080800 in ?? ()
#5  0x00007fdeee9c03e0 in ?? ()
#6  0x00007fdeee9c0090 in ?? ()
#7  0x0000561c56c404c8 in ?? ()
#8  0x00007fdf36b98959 in ?? ()
#9  0x00007fdeee9c0510 in ?? ()
#10 0x00007fdeee9c0700 in ?? ()
#11 0x0000008000000000 in ?? ()
#12 0x00007fdf36f123a7 in ?? ()
#13 0x00007fdeee9c1520 in ?? ()
#14 0x00007fdeee9c0260 in ?? ()
#15 0x0000561c56cf1140 in ?? ()
#16 0x0000000000000000 in ?? ()

Thread 120 (LWP 29974):
#0  0x00007fdf36f0efb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffe894cdd70 in ?? ()
#5  0x00007fdeef1c1670 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 29973):
#0  0x00007fdf36f0efb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 118 (LWP 29972):
#0  0x00007fdf36f0efb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000008 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x0000561c56bc9b70 in ?? ()
#5  0x00007fdef01c3680 in ?? ()
#6  0x0000000000000010 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 29971):
#0  0x00007fdf36f0efb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000008 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x0000561c56f51390 in ?? ()
#5  0x00007fdef09c4550 in ?? ()
#6  0x0000000000000010 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 116 (LWP 29970):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000007 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x0000561c56f5273c in ?? ()
#4  0x00007fdef11c55c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fdef11c55e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x0000561c56f52728 in ?? ()
#9  0x00007fdf36f0e770 in ?? ()
#10 0x00007fdef11c55e0 in ?? ()
#11 0x00007fdef11c5640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 115 (LWP 29969):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x0000561c56f526bc in ?? ()
#4  0x00007fdef19c65c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fdef19c65e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x0000561c56f526a8 in ?? ()
#9  0x00007fdf36f0e770 in ?? ()
#10 0x00007fdef19c65e0 in ?? ()
#11 0x00007fdef19c6640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 114 (LWP 29968):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 29967):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 29966):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 29965):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 29964):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 29963):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 29962):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 29961):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 29960):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 29959):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 29958):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 29957):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 29956):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 29955):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 29954):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 29953):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 29952):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 97 (LWP 29951):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 29950):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 29949):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 29948):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 29947):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 29946):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 29945):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 29944):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 29943):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 29942):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 29941):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 29940):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 29939):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 29938):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 29937):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 29936):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 29935):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 29934):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 29933):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 29932):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 29931):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 76 (LWP 29930):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000004 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x0000561c56f1b138 in ?? ()
#4  0x00007fdf051ed5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fdf051ed5e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 75 (LWP 29929):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 74 (LWP 29928):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 29927):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 29926):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 29925):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 29924):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 29923):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 29922):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 29921):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 29920):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 29919):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 29918):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 29917):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 29916):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 29915):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 29914):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 29913):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 29912):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 29911):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 56 (LWP 29910):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x0000561c56f1a638 in ?? ()
#4  0x00007fdf0f2015c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fdf0f2015e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 55 (LWP 29909):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 29908):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 29907):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 29906):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 29905):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 29904):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 29903):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 29902):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 29901):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 29900):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 29899):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 29898):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 29897):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 29896):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 29895):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 29894):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 29893):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 29892):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 29891):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 36 (LWP 29890):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000330 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x0000561c56e35b38 in ?? ()
#4  0x00007fdf192155c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fdf192155e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 35 (LWP 29889):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000585 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x0000561c56e35abc in ?? ()
#4  0x00007fdf19a165c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fdf19a165e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x0000561c56e35aa8 in ?? ()
#9  0x00007fdf36f0e770 in ?? ()
#10 0x00007fdf19a165e0 in ?? ()
#11 0x00007fdf19a16640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 34 (LWP 29888):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x00000000000019f3 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x0000561c56e35a3c in ?? ()
#4  0x00007fdf1a2175c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fdf1a2175e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x0000561c56e35a28 in ?? ()
#9  0x00007fdf36f0e770 in ?? ()
#10 0x00007fdf1a2175e0 in ?? ()
#11 0x00007fdf1a217640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 33 (LWP 29887):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x000000000000051d in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x0000561c56e359bc in ?? ()
#4  0x00007fdf1aa185c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fdf1aa185e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x0000561c56e359a8 in ?? ()
#9  0x00007fdf36f0e770 in ?? ()
#10 0x00007fdf1aa185e0 in ?? ()
#11 0x00007fdf1aa18640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 32 (LWP 29886):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x000000000000182f in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x0000561c56e3593c in ?? ()
#4  0x00007fdf1b2195c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fdf1b2195e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x0000561c56e35928 in ?? ()
#9  0x00007fdf36f0e770 in ?? ()
#10 0x00007fdf1b2195e0 in ?? ()
#11 0x00007fdf1b219640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 31 (LWP 29885):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x00000000000018c7 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x0000561c56e358bc in ?? ()
#4  0x00007fdf1ba1a5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fdf1ba1a5e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x0000561c56e358a8 in ?? ()
#9  0x00007fdf36f0e770 in ?? ()
#10 0x00007fdf1ba1a5e0 in ?? ()
#11 0x00007fdf1ba1a640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 30 (LWP 29884):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 29883):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 29882):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 29881):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 29880):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 29879):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 29878):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 29877):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 29876):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 29875):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 29874):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 19 (LWP 29873):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 18 (LWP 29872):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 17 (LWP 29871):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 16 (LWP 29870):
#0  0x00007fdf36f0efb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 15 (LWP 29869):
#0  0x00007fdf36f0efb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x0000561c56baf6c8 in ?? ()
#5  0x00007fdf23a2a6a0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 29867):
#0  0x00007fdf36f0efb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 13 (LWP 29866):
#0  0x00007fdf36f0ead3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 29865):
#0  0x00007fdf35030a47 in ?? ()
#1  0x00007fdf25a2e680 in ?? ()
#2  0x00007fdf30334571 in ?? ()
#3  0x00000000000002de in ?? ()
#4  0x0000561c56ca7398 in ?? ()
#5  0x00007fdf25a2e6c0 in ?? ()
#6  0x00007fdf25a2e840 in ?? ()
#7  0x0000561c56d4c670 in ?? ()
#8  0x00007fdf3033625d in ?? ()
#9  0x3fb961304f278000 in ?? ()
#10 0x0000561c56c98c00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x0000561c56c98c00 in ?? ()
#13 0x0000000056ca7398 in ?? ()
#14 0x0000561c00000000 in ?? ()
#15 0x41da4f1fed70f99b in ?? ()
#16 0x0000561c56d4c670 in ?? ()
#17 0x00007fdf25a2e720 in ?? ()
#18 0x00007fdf3033aba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb961304f278000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 11 (LWP 29864):
#0  0x00007fdf35030a47 in ?? ()
#1  0x00007fdf2622f680 in ?? ()
#2  0x00007fdf30334571 in ?? ()
#3  0x00000000000002de in ?? ()
#4  0x0000561c56ca7018 in ?? ()
#5  0x00007fdf2622f6c0 in ?? ()
#6  0x00007fdf2622f840 in ?? ()
#7  0x0000561c56d4c670 in ?? ()
#8  0x00007fdf3033625d in ?? ()
#9  0x3faeadec9e098000 in ?? ()
#10 0x0000561c56c98680 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x0000561c56c98680 in ?? ()
#13 0x0000000056ca7018 in ?? ()
#14 0x0000561c00000000 in ?? ()
#15 0x41da4f1fed70f9a0 in ?? ()
#16 0x0000561c56d4c670 in ?? ()
#17 0x00007fdf2622f720 in ?? ()
#18 0x00007fdf3033aba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3faeadec9e098000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 10 (LWP 29863):
#0  0x00007fdf35030a47 in ?? ()
#1  0x00007fdf26a30680 in ?? ()
#2  0x00007fdf30334571 in ?? ()
#3  0x00000000000002de in ?? ()
#4  0x0000561c56ca7558 in ?? ()
#5  0x00007fdf26a306c0 in ?? ()
#6  0x00007fdf26a30840 in ?? ()
#7  0x0000561c56d4c670 in ?? ()
#8  0x00007fdf3033625d in ?? ()
#9  0x3f8a14f210ba0000 in ?? ()
#10 0x0000561c56c98100 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x0000561c56c98100 in ?? ()
#13 0x0000000056ca7558 in ?? ()
#14 0x0000561c00000000 in ?? ()
#15 0x41da4f1fed70f9a0 in ?? ()
#16 0x0000561c56d4c670 in ?? ()
#17 0x00007fdf26a30720 in ?? ()
#18 0x00007fdf3033aba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3f8a14f210ba0000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 9 (LWP 29862):
#0  0x00007fdf35030a47 in ?? ()
#1  0x00007fdf28612680 in ?? ()
#2  0x00007fdf30334571 in ?? ()
#3  0x00000000000002de in ?? ()
#4  0x0000561c56ca6e58 in ?? ()
#5  0x00007fdf286126c0 in ?? ()
#6  0x00007fdf28612840 in ?? ()
#7  0x0000561c56d4c670 in ?? ()
#8  0x00007fdf3033625d in ?? ()
#9  0x3fb95a821712c000 in ?? ()
#10 0x0000561c56c97600 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x0000561c56c97600 in ?? ()
#13 0x0000000056ca6e58 in ?? ()
#14 0x0000561c00000000 in ?? ()
#15 0x41da4f1fed70f99b in ?? ()
#16 0x0000561c56d4c670 in ?? ()
#17 0x00007fdf28612720 in ?? ()
#18 0x00007fdf3033aba3 in ?? ()
#19 0x0000000000000000 in ?? ()

Thread 8 (LWP 29859):
#0  0x00007fdf35023cb9 in ?? ()
#1  0x00007fdf29e15840 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 7 (LWP 29858):
#0  0x00007fdf36f0efb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 6 (LWP 29857):
#0  0x00007fdf36f129e2 in ?? ()
#1  0x0000561c56bc9ee0 in ?? ()
#2  0x00007fdf28e134d0 in ?? ()
#3  0x00007fdf28e13450 in ?? ()
#4  0x00007fdf28e13570 in ?? ()
#5  0x00007fdf28e13790 in ?? ()
#6  0x00007fdf28e137a0 in ?? ()
#7  0x00007fdf28e134e0 in ?? ()
#8  0x00007fdf28e134d0 in ?? ()
#9  0x0000561c56bc8350 in ?? ()
#10 0x00007fdf372fdc6f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 5 (LWP 29851):
#0  0x00007fdf36f0efb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000002d in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x0000561c56d4edc8 in ?? ()
#5  0x00007fdf2ae17430 in ?? ()
#6  0x000000000000005a in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 4 (LWP 29850):
#0  0x00007fdf36f0efb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x0000561c56bae848 in ?? ()
#5  0x00007fdf2b618790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 3 (LWP 29849):
#0  0x00007fdf36f0efb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x0000561c56bae2a8 in ?? ()
#5  0x00007fdf2be19790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 2 (LWP 29848):
#0  0x00007fdf36f0efb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x0000561c56bae188 in ?? ()
#5  0x00007fdf2c61a790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 1 (LWP 29846):
#0  0x00007fdf36f12d50 in ?? ()
#1  0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20251212 21:11:35.254654 23994 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 2 with UUID c941504e89314a6a868d59585d254b81 and pid 30247
************************ BEGIN STACKS **************************
[New LWP 30248]
[New LWP 30249]
[New LWP 30250]
[New LWP 30251]
[New LWP 30257]
[New LWP 30258]
[New LWP 30259]
[New LWP 30262]
[New LWP 30263]
[New LWP 30264]
[New LWP 30265]
[New LWP 30266]
[New LWP 30267]
[New LWP 30269]
[New LWP 30270]
[New LWP 30271]
[New LWP 30272]
[New LWP 30273]
[New LWP 30274]
[New LWP 30275]
[New LWP 30276]
[New LWP 30277]
[New LWP 30278]
[New LWP 30279]
[New LWP 30280]
[New LWP 30281]
[New LWP 30282]
[New LWP 30283]
[New LWP 30284]
[New LWP 30285]
[New LWP 30286]
[New LWP 30287]
[New LWP 30288]
[New LWP 30289]
[New LWP 30290]
[New LWP 30291]
[New LWP 30292]
[New LWP 30293]
[New LWP 30294]
[New LWP 30295]
[New LWP 30296]
[New LWP 30297]
[New LWP 30298]
[New LWP 30299]
[New LWP 30300]
[New LWP 30301]
[New LWP 30302]
[New LWP 30303]
[New LWP 30304]
[New LWP 30305]
[New LWP 30306]
[New LWP 30307]
[New LWP 30308]
[New LWP 30309]
[New LWP 30310]
[New LWP 30311]
[New LWP 30312]
[New LWP 30313]
[New LWP 30314]
[New LWP 30315]
[New LWP 30316]
[New LWP 30317]
[New LWP 30318]
[New LWP 30319]
[New LWP 30320]
[New LWP 30321]
[New LWP 30322]
[New LWP 30323]
[New LWP 30324]
[New LWP 30325]
[New LWP 30326]
[New LWP 30327]
[New LWP 30328]
[New LWP 30329]
[New LWP 30330]
[New LWP 30331]
[New LWP 30332]
[New LWP 30333]
[New LWP 30334]
[New LWP 30335]
[New LWP 30336]
[New LWP 30337]
[New LWP 30338]
[New LWP 30339]
[New LWP 30340]
[New LWP 30341]
[New LWP 30342]
[New LWP 30343]
[New LWP 30344]
[New LWP 30345]
[New LWP 30346]
[New LWP 30347]
[New LWP 30348]
[New LWP 30349]
[New LWP 30350]
[New LWP 30351]
[New LWP 30352]
[New LWP 30353]
[New LWP 30354]
[New LWP 30355]
[New LWP 30356]
[New LWP 30357]
[New LWP 30358]
[New LWP 30359]
[New LWP 30360]
[New LWP 30361]
[New LWP 30362]
[New LWP 30363]
[New LWP 30364]
[New LWP 30365]
[New LWP 30366]
[New LWP 30367]
[New LWP 30368]
[New LWP 30369]
[New LWP 30370]
[New LWP 30371]
[New LWP 30372]
[New LWP 30373]
[New LWP 30374]
[New LWP 30375]
[New LWP 30376]
[New LWP 30377]
0x00007fe29a81bd50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 30247 "kudu"  0x00007fe29a81bd50 in ?? ()
  2    LWP 30248 "kudu"  0x00007fe29a817fb9 in ?? ()
  3    LWP 30249 "kudu"  0x00007fe29a817fb9 in ?? ()
  4    LWP 30250 "kudu"  0x00007fe29a817fb9 in ?? ()
  5    LWP 30251 "kernel-watcher-" 0x00007fe29a817fb9 in ?? ()
  6    LWP 30257 "ntp client-3025" 0x00007fe29a81b9e2 in ?? ()
  7    LWP 30258 "file cache-evic" 0x00007fe29a817fb9 in ?? ()
  8    LWP 30259 "sq_acceptor" 0x00007fe29892ccb9 in ?? ()
  9    LWP 30262 "rpc reactor-302" 0x00007fe298939a47 in ?? ()
  10   LWP 30263 "rpc reactor-302" 0x00007fe298939a47 in ?? ()
  11   LWP 30264 "rpc reactor-302" 0x00007fe298939a47 in ?? ()
  12   LWP 30265 "rpc reactor-302" 0x00007fe298939a47 in ?? ()
  13   LWP 30266 "MaintenanceMgr " 0x00007fe29a817ad3 in ?? ()
  14   LWP 30267 "txn-status-mana" 0x00007fe29a817fb9 in ?? ()
  15   LWP 30269 "collect_and_rem" 0x00007fe29a817fb9 in ?? ()
  16   LWP 30270 "tc-session-exp-" 0x00007fe29a817fb9 in ?? ()
  17   LWP 30271 "rpc worker-3027" 0x00007fe29a817ad3 in ?? ()
  18   LWP 30272 "rpc worker-3027" 0x00007fe29a817ad3 in ?? ()
  19   LWP 30273 "rpc worker-3027" 0x00007fe29a817ad3 in ?? ()
  20   LWP 30274 "rpc worker-3027" 0x00007fe29a817ad3 in ?? ()
  21   LWP 30275 "rpc worker-3027" 0x00007fe29a817ad3 in ?? ()
  22   LWP 30276 "rpc worker-3027" 0x00007fe29a817ad3 in ?? ()
  23   LWP 30277 "rpc worker-3027" 0x00007fe29a817ad3 in ?? ()
  24   LWP 30278 "rpc worker-3027" 0x00007fe29a817ad3 in ?? ()
  25   LWP 30279 "rpc worker-3027" 0x00007fe29a817ad3 in ?? ()
  26   LWP 30280 "rpc worker-3028" 0x00007fe29a817ad3 in ?? ()
  27   LWP 30281 "rpc worker-3028" 0x00007fe29a817ad3 in ?? ()
  28   LWP 30282 "rpc worker-3028" 0x00007fe29a817ad3 in ?? ()
  29   LWP 30283 "rpc worker-3028" 0x00007fe29a817ad3 in ?? ()
  30   LWP 30284 "rpc worker-3028" 0x00007fe29a817ad3 in ?? ()
  31   LWP 30285 "rpc worker-3028" 0x00007fe29a817ad3 in ?? ()
  32   LWP 30286 "rpc worker-3028" 0x00007fe29a817ad3 in ?? ()
  33   LWP 30287 "rpc worker-3028" 0x00007fe29a817ad3 in ?? ()
  34   LWP 30288 "rpc worker-3028" 0x00007fe29a817ad3 in ?? ()
  35   LWP 30289 "rpc worker-3028" 0x00007fe29a817ad3 in ?? ()
  36   LWP 30290 "rpc worker-3029" 0x00007fe29a817ad3 in ?? ()
  37   LWP 30291 "rpc worker-3029" 0x00007fe29a817ad3 in ?? ()
  38   LWP 30292 "rpc worker-3029" 0x00007fe29a817ad3 in ?? ()
  39   LWP 30293 "rpc worker-3029" 0x00007fe29a817ad3 in ?? ()
  40   LWP 30294 "rpc worker-3029" 0x00007fe29a817ad3 in ?? ()
  41   LWP 30295 "rpc worker-3029" 0x00007fe29a817ad3 in ?? ()
  42   LWP 30296 "rpc worker-3029" 0x00007fe29a817ad3 in ?? ()
  43   LWP 30297 "rpc worker-3029" 0x00007fe29a817ad3 in ?? ()
  44   LWP 30298 "rpc worker-3029" 0x00007fe29a817ad3 in ?? ()
  45   LWP 30299 "rpc worker-3029" 0x00007fe29a817ad3 in ?? ()
  46   LWP 30300 "rpc worker-3030" 0x00007fe29a817ad3 in ?? ()
  47   LWP 30301 "rpc worker-3030" 0x00007fe29a817ad3 in ?? ()
  48   LWP 30302 "rpc worker-3030" 0x00007fe29a817ad3 in ?? ()
  49   LWP 30303 "rpc worker-3030" 0x00007fe29a817ad3 in ?? ()
  50   LWP 30304 "rpc worker-3030" 0x00007fe29a817ad3 in ?? ()
  51   LWP 30305 "rpc worker-3030" 0x00007fe29a817ad3 in ?? ()
  52   LWP 30306 "rpc worker-3030" 0x00007fe29a817ad3 in ?? ()
  53   LWP 30307 "rpc worker-3030" 0x00007fe29a817ad3 in ?? ()
  54   LWP 30308 "rpc worker-3030" 0x00007fe29a817ad3 in ?? ()
  55   LWP 30309 "rpc worker-3030" 0x00007fe29a817ad3 in ?? ()
  56   LWP 30310 "rpc worker-3031" 0x00007fe29a817ad3 in ?? ()
  57   LWP 30311 "rpc worker-3031" 0x00007fe29a817ad3 in ?? ()
  58   LWP 30312 "rpc worker-3031" 0x00007fe29a817ad3 in ?? ()
  59   LWP 30313 "rpc worker-3031" 0x00007fe29a817ad3 in ?? ()
  60   LWP 30314 "rpc worker-3031" 0x00007fe29a817ad3 in ?? ()
  61   LWP 30315 "rpc worker-3031" 0x00007fe29a817ad3 in ?? ()
  62   LWP 30316 "rpc worker-3031" 0x00007fe29a817ad3 in ?? ()
  63   LWP 30317 "rpc worker-3031" 0x00007fe29a817ad3 in ?? ()
  64   LWP 30318 "rpc worker-3031" 0x00007fe29a817ad3 in ?? ()
  65   LWP 30319 "rpc worker-3031" 0x00007fe29a817ad3 in ?? ()
  66   LWP 30320 "rpc worker-3032" 0x00007fe29a817ad3 in ?? ()
  67   LWP 30321 "rpc worker-3032" 0x00007fe29a817ad3 in ?? ()
  68   LWP 30322 "rpc worker-3032" 0x00007fe29a817ad3 in ?? ()
  69   LWP 30323 "rpc worker-3032" 0x00007fe29a817ad3 in ?? ()
  70   LWP 30324 "rpc worker-3032" 0x00007fe29a817ad3 in ?? ()
  71   LWP 30325 "rpc worker-3032" 0x00007fe29a817ad3 in ?? ()
  72   LWP 30326 "rpc worker-3032" 0x00007fe29a817ad3 in ?? ()
  73   LWP 30327 "rpc worker-3032" 0x00007fe29a817ad3 in ?? ()
  74   LWP 30328 "rpc worker-3032" 0x00007fe29a817ad3 in ?? ()
  75   LWP 30329 "rpc worker-3032" 0x00007fe29a817ad3 in ?? ()
  76   LWP 30330 "rpc worker-3033" 0x00007fe29a817ad3 in ?? ()
  77   LWP 30331 "rpc worker-3033" 0x00007fe29a817ad3 in ?? ()
  78   LWP 30332 "rpc worker-3033" 0x00007fe29a817ad3 in ?? ()
  79   LWP 30333 "rpc worker-3033" 0x00007fe29a817ad3 in ?? ()
  80   LWP 30334 "rpc worker-3033" 0x00007fe29a817ad3 in ?? ()
  81   LWP 30335 "rpc worker-3033" 0x00007fe29a817ad3 in ?? ()
  82   LWP 30336 "rpc worker-3033" 0x00007fe29a817ad3 in ?? ()
  83   LWP 30337 "rpc worker-3033" 0x00007fe29a817ad3 in ?? ()
  84   LWP 30338 "rpc worker-3033" 0x00007fe29a817ad3 in ?? ()
  85   LWP 30339 "rpc worker-3033" 0x00007fe29a817ad3 in ?? ()
  86   LWP 30340 "rpc worker-3034" 0x00007fe29a817ad3 in ?? ()
  87   LWP 30341 "rpc worker-3034" 0x00007fe29a817ad3 in ?? ()
  88   LWP 30342 "rpc worker-3034" 0x00007fe29a817ad3 in ?? ()
  89   LWP 30343 "rpc worker-3034" 0x00007fe29a817ad3 in ?? ()
  90   LWP 30344 "rpc worker-3034" 0x00007fe29a817ad3 in ?? ()
  91   LWP 30345 "rpc worker-3034" 0x00007fe29a817ad3 in ?? ()
  92   LWP 30346 "rpc worker-3034" 0x00007fe29a817ad3 in ?? ()
  93   LWP 30347 "rpc worker-3034" 0x00007fe29a817ad3 in ?? ()
  94   LWP 30348 "rpc worker-3034" 0x00007fe29a817ad3 in ?? ()
  95   LWP 30349 "rpc worker-3034" 0x00007fe29a817ad3 in ?? ()
  96   LWP 30350 "rpc worker-3035" 0x00007fe29a817ad3 in ?? ()
  97   LWP 30351 "rpc worker-3035" 0x00007fe29a817ad3 in ?? ()
  98   LWP 30352 "rpc worker-3035" 0x00007fe29a817ad3 in ?? ()
  99   LWP 30353 "rpc worker-3035" 0x00007fe29a817ad3 in ?? ()
  100  LWP 30354 "rpc worker-3035" 0x00007fe29a817ad3 in ?? ()
  101  LWP 30355 "rpc worker-3035" 0x00007fe29a817ad3 in ?? ()
  102  LWP 30356 "rpc worker-3035" 0x00007fe29a817ad3 in ?? ()
  103  LWP 30357 "rpc worker-3035" 0x00007fe29a817ad3 in ?? ()
  104  LWP 30358 "rpc worker-3035" 0x00007fe29a817ad3 in ?? ()
  105  LWP 30359 "rpc worker-3035" 0x00007fe29a817ad3 in ?? ()
  106  LWP 30360 "rpc worker-3036" 0x00007fe29a817ad3 in ?? ()
  107  LWP 30361 "rpc worker-3036" 0x00007fe29a817ad3 in ?? ()
  108  LWP 30362 "rpc worker-3036" 0x00007fe29a817ad3 in ?? ()
  109  LWP 30363 "rpc worker-3036" 0x00007fe29a817ad3 in ?? ()
  110  LWP 30364 "rpc worker-3036" 0x00007fe29a817ad3 in ?? ()
  111  LWP 30365 "rpc worker-3036" 0x00007fe29a817ad3 in ?? ()
  112  LWP 30366 "rpc worker-3036" 0x00007fe29a817ad3 in ?? ()
  113  LWP 30367 "rpc worker-3036" 0x00007fe29a817ad3 in ?? ()
  114  LWP 30368 "rpc worker-3036" 0x00007fe29a817ad3 in ?? ()
  115  LWP 30369 "rpc worker-3036" 0x00007fe29a817ad3 in ?? ()
  116  LWP 30370 "rpc worker-3037" 0x00007fe29a817ad3 in ?? ()
  117  LWP 30371 "diag-logger-303" 0x00007fe29a817fb9 in ?? ()
  118  LWP 30372 "result-tracker-" 0x00007fe29a817fb9 in ?? ()
  119  LWP 30373 "excess-log-dele" 0x00007fe29a817fb9 in ?? ()
  120  LWP 30374 "tcmalloc-memory" 0x00007fe29a817fb9 in ?? ()
  121  LWP 30375 "acceptor-30375" 0x00007fe29893b0c7 in ?? ()
  122  LWP 30376 "heartbeat-30376" 0x00007fe29a817fb9 in ?? ()
  123  LWP 30377 "maintenance_sch" 0x00007fe29a817fb9 in ?? ()

Thread 123 (LWP 30377):
#0  0x00007fe29a817fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000023 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055b438e13e50 in ?? ()
#5  0x00007fe2512c7470 in ?? ()
#6  0x0000000000000046 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 30376):
#0  0x00007fe29a817fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000000a in ?? ()
#3  0x0000000100000081 in ?? ()
#4  0x000055b438d63934 in ?? ()
#5  0x00007fe251ac83f0 in ?? ()
#6  0x0000000000000015 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00007fe251ac8410 in ?? ()
#9  0x0000000200000000 in ?? ()
#10 0x0000008000000189 in ?? ()
#11 0x00007fe251ac8470 in ?? ()
#12 0x00007fe29a48b711 in ?? ()
#13 0x0000000000000000 in ?? ()

Thread 121 (LWP 30375):
#0  0x00007fe29893b0c7 in ?? ()
#1  0x00007fe2522c9020 in ?? ()
#2  0x00007fe29a49bec2 in ?? ()
#3  0x00007fe2522c9020 in ?? ()
#4  0x0000000000080800 in ?? ()
#5  0x00007fe2522c93e0 in ?? ()
#6  0x00007fe2522c9090 in ?? ()
#7  0x000055b438d0e4c8 in ?? ()
#8  0x00007fe29a4a1959 in ?? ()
#9  0x00007fe2522c9510 in ?? ()
#10 0x00007fe2522c9700 in ?? ()
#11 0x0000008000000000 in ?? ()
#12 0x00007fe29a81b3a7 in ?? ()
#13 0x00007fe2522ca520 in ?? ()
#14 0x00007fe2522c9260 in ?? ()
#15 0x000055b438dbf140 in ?? ()
#16 0x0000000000000000 in ?? ()

Thread 120 (LWP 30374):
#0  0x00007fe29a817fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffd3168eac0 in ?? ()
#5  0x00007fe252aca670 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 30373):
#0  0x00007fe29a817fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 118 (LWP 30372):
#0  0x00007fe29a817fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055b438c97b70 in ?? ()
#5  0x00007fe253acc680 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 30371):
#0  0x00007fe29a817fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055b439010790 in ?? ()
#5  0x00007fe2542cd550 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 116 (LWP 30370):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000007 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055b4390196bc in ?? ()
#4  0x00007fe254ace5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fe254ace5e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055b4390196a8 in ?? ()
#9  0x00007fe29a817770 in ?? ()
#10 0x00007fe254ace5e0 in ?? ()
#11 0x00007fe254ace640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 115 (LWP 30369):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055b43901963c in ?? ()
#4  0x00007fe2552cf5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fe2552cf5e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055b439019628 in ?? ()
#9  0x00007fe29a817770 in ?? ()
#10 0x00007fe2552cf5e0 in ?? ()
#11 0x00007fe2552cf640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 114 (LWP 30368):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 30367):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 30366):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 30365):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 30364):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 30363):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 30362):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 30361):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 30360):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 30359):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 30358):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 30357):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 30356):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 30355):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 30354):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 30353):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 30352):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 97 (LWP 30351):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 30350):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 30349):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 30348):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 30347):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 30346):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 30345):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 30344):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 30343):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 30342):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 30341):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 30340):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 30339):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 30338):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 30337):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 30336):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 30335):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 30334):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 30333):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 30332):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 30331):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 76 (LWP 30330):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000285 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055b4390180bc in ?? ()
#4  0x00007fe268af65c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fe268af65e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055b4390180a8 in ?? ()
#9  0x00007fe29a817770 in ?? ()
#10 0x00007fe268af65e0 in ?? ()
#11 0x00007fe268af6640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 75 (LWP 30329):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x00000000000002d2 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055b439018038 in ?? ()
#4  0x00007fe2692f75c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fe2692f75e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 74 (LWP 30328):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 30327):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 30326):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 30325):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 30324):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 30323):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 30322):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 30321):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 30320):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 30319):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 30318):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 30317):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 30316):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 30315):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 30314):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 30313):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 30312):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 30311):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 56 (LWP 30310):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055b4390155b8 in ?? ()
#4  0x00007fe272b0a5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fe272b0a5e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 55 (LWP 30309):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 30308):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 30307):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 30306):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 30305):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 30304):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 30303):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 30302):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 30301):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 30300):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 30299):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 30298):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 30297):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 30296):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 30295):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 30294):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 30293):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 30292):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 30291):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 36 (LWP 30290):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000004 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000055b439014c38 in ?? ()
#4  0x00007fe27cb1e5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fe27cb1e5e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 35 (LWP 30289):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000045 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x000055b439014bbc in ?? ()
#4  0x00007fe27d31f5c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fe27d31f5e0 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000055b439014ba8 in ?? ()
#9  0x00007fe29a817770 in ?? ()
#10 0x00007fe27d31f5e0 in ?? ()
#11 0x00007fe27d31f640 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 34 (LWP 30288):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 33 (LWP 30287):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 32 (LWP 30286):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 31 (LWP 30285):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 30 (LWP 30284):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 30283):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 30282):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 30281):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 30280):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 30279):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 30278):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 30277):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 30276):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 30275):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 30274):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 19 (LWP 30273):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 18 (LWP 30272):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 17 (LWP 30271):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 16 (LWP 30270):
#0  0x00007fe29a817fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 15 (LWP 30269):
#0  0x00007fe29a817fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055b438c7d6c8 in ?? ()
#5  0x00007fe2873336a0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 30267):
#0  0x00007fe29a817fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 13 (LWP 30266):
#0  0x00007fe29a817ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 30265):
#0  0x00007fe298939a47 in ?? ()
#1  0x00007fe289337680 in ?? ()
#2  0x00007fe293c3d571 in ?? ()
#3  0x00000000000002df in ?? ()
#4  0x000055b438d75398 in ?? ()
#5  0x00007fe2893376c0 in ?? ()
#6  0x00007fe289337840 in ?? ()
#7  0x000055b438e1a670 in ?? ()
#8  0x00007fe293c3f25d in ?? ()
#9  0x3fb95f707de34000 in ?? ()
#10 0x000055b438d66c00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055b438d66c00 in ?? ()
#13 0x0000000038d75398 in ?? ()
#14 0x000055b400000000 in ?? ()
#15 0x41da4f1fed70f99f in ?? ()
#16 0x000055b438e1a670 in ?? ()
#17 0x00007fe289337720 in ?? ()
#18 0x00007fe293c43ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb95f707de34000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 11 (LWP 30264):
#0  0x00007fe298939a47 in ?? ()
#1  0x00007fe289b38680 in ?? ()
#2  0x00007fe293c3d571 in ?? ()
#3  0x00000000000002df in ?? ()
#4  0x000055b438d75018 in ?? ()
#5  0x00007fe289b386c0 in ?? ()
#6  0x00007fe289b38840 in ?? ()
#7  0x000055b438e1a670 in ?? ()
#8  0x00007fe293c3f25d in ?? ()
#9  0x3fb98e5282698000 in ?? ()
#10 0x000055b438d66680 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055b438d66680 in ?? ()
#13 0x0000000038d75018 in ?? ()
#14 0x000055b400000000 in ?? ()
#15 0x41da4f1fed70f99c in ?? ()
#16 0x000055b438e1a670 in ?? ()
#17 0x00007fe289b38720 in ?? ()
#18 0x00007fe293c43ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb98e5282698000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 10 (LWP 30263):
#0  0x00007fe298939a47 in ?? ()
#1  0x00007fe28a339680 in ?? ()
#2  0x00007fe293c3d571 in ?? ()
#3  0x00000000000002df in ?? ()
#4  0x000055b438d75558 in ?? ()
#5  0x00007fe28a3396c0 in ?? ()
#6  0x00007fe28a339840 in ?? ()
#7  0x000055b438e1a670 in ?? ()
#8  0x00007fe293c3f25d in ?? ()
#9  0x3fa7f016a04c8000 in ?? ()
#10 0x000055b438d65600 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055b438d65600 in ?? ()
#13 0x0000000038d75558 in ?? ()
#14 0x000055b400000000 in ?? ()
#15 0x41da4f1fed70f99b in ?? ()
#16 0x000055b438e1a670 in ?? ()
#17 0x00007fe28a339720 in ?? ()
#18 0x00007fe293c43ba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fa7f016a04c8000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 9 (LWP 30262):
#0  0x00007fe298939a47 in ?? ()
#1  0x00007fe28bf1b680 in ?? ()
#2  0x00007fe293c3d571 in ?? ()
#3  0x00000000000002df in ?? ()
#4  0x000055b438d74e58 in ?? ()
#5  0x00007fe28bf1b6c0 in ?? ()
#6  0x00007fe28bf1b840 in ?? ()
#7  0x000055b438e1a670 in ?? ()
#8  0x00007fe293c3f25d in ?? ()
#9  0x3fb95279f55b8000 in ?? ()
#10 0x000055b438d65b80 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000055b438d65b80 in ?? ()
#13 0x0000000038d74e58 in ?? ()
#14 0x000055b400000000 in ?? ()
#15 0x41da4f1fed70f99f in ?? ()
#16 0x000055b438e1a670 in ?? ()
#17 0x00007fe28bf1b720 in ?? ()
#18 0x00007fe293c43ba3 in ?? ()
#19 0x0000000000000000 in ?? ()

Thread 8 (LWP 30259):
#0  0x00007fe29892ccb9 in ?? ()
#1  0x00007fe28d71e840 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 7 (LWP 30258):
#0  0x00007fe29a817fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 6 (LWP 30257):
#0  0x00007fe29a81b9e2 in ?? ()
#1  0x000055b438c97ee0 in ?? ()
#2  0x00007fe28c71c4d0 in ?? ()
#3  0x00007fe28c71c450 in ?? ()
#4  0x00007fe28c71c570 in ?? ()
#5  0x00007fe28c71c790 in ?? ()
#6  0x00007fe28c71c7a0 in ?? ()
#7  0x00007fe28c71c4e0 in ?? ()
#8  0x00007fe28c71c4d0 in ?? ()
#9  0x000055b438c96350 in ?? ()
#10 0x00007fe29ac06c6f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 5 (LWP 30251):
#0  0x00007fe29a817fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x000000000000002d in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055b438e1cdc8 in ?? ()
#5  0x00007fe28e720430 in ?? ()
#6  0x000000000000005a in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 4 (LWP 30250):
#0  0x00007fe29a817fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055b438c7c848 in ?? ()
#5  0x00007fe28ef21790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 3 (LWP 30249):
#0  0x00007fe29a817fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055b438c7c2a8 in ?? ()
#5  0x00007fe28f722790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 2 (LWP 30248):
#0  0x00007fe29a817fb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000055b438c7c188 in ?? ()
#5  0x00007fe28ff23790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 1 (LWP 30247):
#0  0x00007fe29a81bd50 in ?? ()
#1  0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20251212 21:11:35.770576 23994 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 3 with UUID 3d78cc34680848ddacf9620033efe712 and pid 30114
************************ BEGIN STACKS **************************
[New LWP 30117]
[New LWP 30118]
[New LWP 30119]
[New LWP 30120]
[New LWP 30126]
[New LWP 30127]
[New LWP 30128]
[New LWP 30131]
[New LWP 30132]
[New LWP 30133]
[New LWP 30134]
[New LWP 30135]
[New LWP 30136]
[New LWP 30137]
[New LWP 30138]
[New LWP 30139]
[New LWP 30140]
[New LWP 30141]
[New LWP 30142]
[New LWP 30143]
[New LWP 30144]
[New LWP 30145]
[New LWP 30146]
[New LWP 30147]
[New LWP 30148]
[New LWP 30149]
[New LWP 30150]
[New LWP 30151]
[New LWP 30152]
[New LWP 30153]
[New LWP 30154]
[New LWP 30155]
[New LWP 30156]
[New LWP 30157]
[New LWP 30158]
[New LWP 30159]
[New LWP 30160]
[New LWP 30161]
[New LWP 30162]
[New LWP 30163]
[New LWP 30164]
[New LWP 30165]
[New LWP 30166]
[New LWP 30167]
[New LWP 30168]
[New LWP 30169]
[New LWP 30170]
[New LWP 30171]
[New LWP 30172]
[New LWP 30173]
[New LWP 30174]
[New LWP 30175]
[New LWP 30176]
[New LWP 30177]
[New LWP 30178]
[New LWP 30179]
[New LWP 30180]
[New LWP 30181]
[New LWP 30182]
[New LWP 30183]
[New LWP 30184]
[New LWP 30185]
[New LWP 30186]
[New LWP 30187]
[New LWP 30188]
[New LWP 30189]
[New LWP 30190]
[New LWP 30191]
[New LWP 30192]
[New LWP 30193]
[New LWP 30194]
[New LWP 30195]
[New LWP 30196]
[New LWP 30197]
[New LWP 30198]
[New LWP 30199]
[New LWP 30200]
[New LWP 30201]
[New LWP 30202]
[New LWP 30203]
[New LWP 30204]
[New LWP 30205]
[New LWP 30206]
[New LWP 30207]
[New LWP 30208]
[New LWP 30209]
[New LWP 30210]
[New LWP 30211]
[New LWP 30212]
[New LWP 30213]
[New LWP 30214]
[New LWP 30215]
[New LWP 30216]
[New LWP 30217]
[New LWP 30218]
[New LWP 30219]
[New LWP 30220]
[New LWP 30221]
[New LWP 30222]
[New LWP 30223]
[New LWP 30224]
[New LWP 30225]
[New LWP 30226]
[New LWP 30227]
[New LWP 30228]
[New LWP 30229]
[New LWP 30230]
[New LWP 30231]
[New LWP 30232]
[New LWP 30233]
[New LWP 30234]
[New LWP 30235]
[New LWP 30236]
[New LWP 30237]
[New LWP 30238]
[New LWP 30239]
[New LWP 30240]
[New LWP 30241]
[New LWP 30242]
[New LWP 30243]
[New LWP 30244]
[New LWP 30245]
0x00007f92bd413d50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 30114 "kudu"  0x00007f92bd413d50 in ?? ()
  2    LWP 30117 "kudu"  0x00007f92bd40ffb9 in ?? ()
  3    LWP 30118 "kudu"  0x00007f92bd40ffb9 in ?? ()
  4    LWP 30119 "kudu"  0x00007f92bd40ffb9 in ?? ()
  5    LWP 30120 "kernel-watcher-" 0x00007f92bd40ffb9 in ?? ()
  6    LWP 30126 "ntp client-3012" 0x00007f92bd4139e2 in ?? ()
  7    LWP 30127 "file cache-evic" 0x00007f92bd40ffb9 in ?? ()
  8    LWP 30128 "sq_acceptor" 0x00007f92bb524cb9 in ?? ()
  9    LWP 30131 "rpc reactor-301" 0x00007f92bb531a47 in ?? ()
  10   LWP 30132 "rpc reactor-301" 0x00007f92bb531a47 in ?? ()
  11   LWP 30133 "rpc reactor-301" 0x00007f92bb531a47 in ?? ()
  12   LWP 30134 "rpc reactor-301" 0x00007f92bb531a47 in ?? ()
  13   LWP 30135 "MaintenanceMgr " 0x00007f92bd40fad3 in ?? ()
  14   LWP 30136 "txn-status-mana" 0x00007f92bd40ffb9 in ?? ()
  15   LWP 30137 "collect_and_rem" 0x00007f92bd40ffb9 in ?? ()
  16   LWP 30138 "tc-session-exp-" 0x00007f92bd40ffb9 in ?? ()
  17   LWP 30139 "rpc worker-3013" 0x00007f92bd40fad3 in ?? ()
  18   LWP 30140 "rpc worker-3014" 0x00007f92bd40fad3 in ?? ()
  19   LWP 30141 "rpc worker-3014" 0x00007f92bd40fad3 in ?? ()
  20   LWP 30142 "rpc worker-3014" 0x00007f92bd40fad3 in ?? ()
  21   LWP 30143 "rpc worker-3014" 0x00007f92bd40fad3 in ?? ()
  22   LWP 30144 "rpc worker-3014" 0x00007f92bd40fad3 in ?? ()
  23   LWP 30145 "rpc worker-3014" 0x00007f92bd40fad3 in ?? ()
  24   LWP 30146 "rpc worker-3014" 0x00007f92bd40fad3 in ?? ()
  25   LWP 30147 "rpc worker-3014" 0x00007f92bd40fad3 in ?? ()
  26   LWP 30148 "rpc worker-3014" 0x00007f92bd40fad3 in ?? ()
  27   LWP 30149 "rpc worker-3014" 0x00007f92bd40fad3 in ?? ()
  28   LWP 30150 "rpc worker-3015" 0x00007f92bd40fad3 in ?? ()
  29   LWP 30151 "rpc worker-3015" 0x00007f92bd40fad3 in ?? ()
  30   LWP 30152 "rpc worker-3015" 0x00007f92bd40fad3 in ?? ()
  31   LWP 30153 "rpc worker-3015" 0x00007f92bd40fad3 in ?? ()
  32   LWP 30154 "rpc worker-3015" 0x00007f92bd40fad3 in ?? ()
  33   LWP 30155 "rpc worker-3015" 0x00007f92bd40fad3 in ?? ()
  34   LWP 30156 "rpc worker-3015" 0x00007f92bd40fad3 in ?? ()
  35   LWP 30157 "rpc worker-3015" 0x00007f92bd40fad3 in ?? ()
  36   LWP 30158 "rpc worker-3015" 0x00007f92bd40fad3 in ?? ()
  37   LWP 30159 "rpc worker-3015" 0x00007f92bd40fad3 in ?? ()
  38   LWP 30160 "rpc worker-3016" 0x00007f92bd40fad3 in ?? ()
  39   LWP 30161 "rpc worker-3016" 0x00007f92bd40fad3 in ?? ()
  40   LWP 30162 "rpc worker-3016" 0x00007f92bd40fad3 in ?? ()
  41   LWP 30163 "rpc worker-3016" 0x00007f92bd40fad3 in ?? ()
  42   LWP 30164 "rpc worker-3016" 0x00007f92bd40fad3 in ?? ()
  43   LWP 30165 "rpc worker-3016" 0x00007f92bd40fad3 in ?? ()
  44   LWP 30166 "rpc worker-3016" 0x00007f92bd40fad3 in ?? ()
  45   LWP 30167 "rpc worker-3016" 0x00007f92bd40fad3 in ?? ()
  46   LWP 30168 "rpc worker-3016" 0x00007f92bd40fad3 in ?? ()
  47   LWP 30169 "rpc worker-3016" 0x00007f92bd40fad3 in ?? ()
  48   LWP 30170 "rpc worker-3017" 0x00007f92bd40fad3 in ?? ()
  49   LWP 30171 "rpc worker-3017" 0x00007f92bd40fad3 in ?? ()
  50   LWP 30172 "rpc worker-3017" 0x00007f92bd40fad3 in ?? ()
  51   LWP 30173 "rpc worker-3017" 0x00007f92bd40fad3 in ?? ()
  52   LWP 30174 "rpc worker-3017" 0x00007f92bd40fad3 in ?? ()
  53   LWP 30175 "rpc worker-3017" 0x00007f92bd40fad3 in ?? ()
  54   LWP 30176 "rpc worker-3017" 0x00007f92bd40fad3 in ?? ()
  55   LWP 30177 "rpc worker-3017" 0x00007f92bd40fad3 in ?? ()
  56   LWP 30178 "rpc worker-3017" 0x00007f92bd40fad3 in ?? ()
  57   LWP 30179 "rpc worker-3017" 0x00007f92bd40fad3 in ?? ()
  58   LWP 30180 "rpc worker-3018" 0x00007f92bd40fad3 in ?? ()
  59   LWP 30181 "rpc worker-3018" 0x00007f92bd40fad3 in ?? ()
  60   LWP 30182 "rpc worker-3018" 0x00007f92bd40fad3 in ?? ()
  61   LWP 30183 "rpc worker-3018" 0x00007f92bd40fad3 in ?? ()
  62   LWP 30184 "rpc worker-3018" 0x00007f92bd40fad3 in ?? ()
  63   LWP 30185 "rpc worker-3018" 0x00007f92bd40fad3 in ?? ()
  64   LWP 30186 "rpc worker-3018" 0x00007f92bd40fad3 in ?? ()
  65   LWP 30187 "rpc worker-3018" 0x00007f92bd40fad3 in ?? ()
  66   LWP 30188 "rpc worker-3018" 0x00007f92bd40fad3 in ?? ()
  67   LWP 30189 "rpc worker-3018" 0x00007f92bd40fad3 in ?? ()
  68   LWP 30190 "rpc worker-3019" 0x00007f92bd40fad3 in ?? ()
  69   LWP 30191 "rpc worker-3019" 0x00007f92bd40fad3 in ?? ()
  70   LWP 30192 "rpc worker-3019" 0x00007f92bd40fad3 in ?? ()
  71   LWP 30193 "rpc worker-3019" 0x00007f92bd40fad3 in ?? ()
  72   LWP 30194 "rpc worker-3019" 0x00007f92bd40fad3 in ?? ()
  73   LWP 30195 "rpc worker-3019" 0x00007f92bd40fad3 in ?? ()
  74   LWP 30196 "rpc worker-3019" 0x00007f92bd40fad3 in ?? ()
  75   LWP 30197 "rpc worker-3019" 0x00007f92bd40fad3 in ?? ()
  76   LWP 30198 "rpc worker-3019" 0x00007f92bd40fad3 in ?? ()
  77   LWP 30199 "rpc worker-3019" 0x00007f92bd40fad3 in ?? ()
  78   LWP 30200 "rpc worker-3020" 0x00007f92bd40fad3 in ?? ()
  79   LWP 30201 "rpc worker-3020" 0x00007f92bd40fad3 in ?? ()
  80   LWP 30202 "rpc worker-3020" 0x00007f92bd40fad3 in ?? ()
  81   LWP 30203 "rpc worker-3020" 0x00007f92bd40fad3 in ?? ()
  82   LWP 30204 "rpc worker-3020" 0x00007f92bd40fad3 in ?? ()
  83   LWP 30205 "rpc worker-3020" 0x00007f92bd40fad3 in ?? ()
  84   LWP 30206 "rpc worker-3020" 0x00007f92bd40fad3 in ?? ()
  85   LWP 30207 "rpc worker-3020" 0x00007f92bd40fad3 in ?? ()
  86   LWP 30208 "rpc worker-3020" 0x00007f92bd40fad3 in ?? ()
  87   LWP 30209 "rpc worker-3020" 0x00007f92bd40fad3 in ?? ()
  88   LWP 30210 "rpc worker-3021" 0x00007f92bd40fad3 in ?? ()
  89   LWP 30211 "rpc worker-3021" 0x00007f92bd40fad3 in ?? ()
  90   LWP 30212 "rpc worker-3021" 0x00007f92bd40fad3 in ?? ()
  91   LWP 30213 "rpc worker-3021" 0x00007f92bd40fad3 in ?? ()
  92   LWP 30214 "rpc worker-3021" 0x00007f92bd40fad3 in ?? ()
  93   LWP 30215 "rpc worker-3021" 0x00007f92bd40fad3 in ?? ()
  94   LWP 30216 "rpc worker-3021" 0x00007f92bd40fad3 in ?? ()
  95   LWP 30217 "rpc worker-3021" 0x00007f92bd40fad3 in ?? ()
  96   LWP 30218 "rpc worker-3021" 0x00007f92bd40fad3 in ?? ()
  97   LWP 30219 "rpc worker-3021" 0x00007f92bd40fad3 in ?? ()
  98   LWP 30220 "rpc worker-3022" 0x00007f92bd40fad3 in ?? ()
  99   LWP 30221 "rpc worker-3022" 0x00007f92bd40fad3 in ?? ()
  100  LWP 30222 "rpc worker-3022" 0x00007f92bd40fad3 in ?? ()
  101  LWP 30223 "rpc worker-3022" 0x00007f92bd40fad3 in ?? ()
  102  LWP 30224 "rpc worker-3022" 0x00007f92bd40fad3 in ?? ()
  103  LWP 30225 "rpc worker-3022" 0x00007f92bd40fad3 in ?? ()
  104  LWP 30226 "rpc worker-3022" 0x00007f92bd40fad3 in ?? ()
  105  LWP 30227 "rpc worker-3022" 0x00007f92bd40fad3 in ?? ()
  106  LWP 30228 "rpc worker-3022" 0x00007f92bd40fad3 in ?? ()
  107  LWP 30229 "rpc worker-3022" 0x00007f92bd40fad3 in ?? ()
  108  LWP 30230 "rpc worker-3023" 0x00007f92bd40fad3 in ?? ()
  109  LWP 30231 "rpc worker-3023" 0x00007f92bd40fad3 in ?? ()
  110  LWP 30232 "rpc worker-3023" 0x00007f92bd40fad3 in ?? ()
  111  LWP 30233 "rpc worker-3023" 0x00007f92bd40fad3 in ?? ()
  112  LWP 30234 "rpc worker-3023" 0x00007f92bd40fad3 in ?? ()
  113  LWP 30235 "rpc worker-3023" 0x00007f92bd40fad3 in ?? ()
  114  LWP 30236 "rpc worker-3023" 0x00007f92bd40fad3 in ?? ()
  115  LWP 30237 "rpc worker-3023" 0x00007f92bd40fad3 in ?? ()
  116  LWP 30238 "rpc worker-3023" 0x00007f92bd40fad3 in ?? ()
  117  LWP 30239 "diag-logger-302" 0x00007f92bd40ffb9 in ?? ()
  118  LWP 30240 "result-tracker-" 0x00007f92bd40ffb9 in ?? ()
  119  LWP 30241 "excess-log-dele" 0x00007f92bd40ffb9 in ?? ()
  120  LWP 30242 "tcmalloc-memory" 0x00007f92bd40ffb9 in ?? ()
  121  LWP 30243 "acceptor-30243" 0x00007f92bb5330c7 in ?? ()
  122  LWP 30244 "heartbeat-30244" 0x00007f92bd40ffb9 in ?? ()
  123  LWP 30245 "maintenance_sch" 0x00007f92bd40ffb9 in ?? ()

Thread 123 (LWP 30245):
#0  0x00007f92bd40ffb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000026 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000056022cb8de50 in ?? ()
#5  0x00007f92746c0470 in ?? ()
#6  0x000000000000004c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 30244):
#0  0x00007f92bd40ffb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000056022cadd930 in ?? ()
#5  0x00007f9274ec13f0 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 121 (LWP 30243):
#0  0x00007f92bb5330c7 in ?? ()
#1  0x00007f92756c2020 in ?? ()
#2  0x00007f92bd093ec2 in ?? ()
#3  0x00007f92756c2020 in ?? ()
#4  0x0000000000080800 in ?? ()
#5  0x00007f92756c23e0 in ?? ()
#6  0x00007f92756c2090 in ?? ()
#7  0x000056022ca884c8 in ?? ()
#8  0x00007f92bd099959 in ?? ()
#9  0x00007f92756c2510 in ?? ()
#10 0x00007f92756c2700 in ?? ()
#11 0x0000008000000000 in ?? ()
#12 0x00007f92bd4133a7 in ?? ()
#13 0x00007f92756c3520 in ?? ()
#14 0x00007f92756c2260 in ?? ()
#15 0x000056022cb39140 in ?? ()
#16 0x0000000000000000 in ?? ()

Thread 120 (LWP 30242):
#0  0x00007f92bd40ffb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffc5a0aa6b0 in ?? ()
#5  0x00007f9275ec3670 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 30241):
#0  0x00007f92bd40ffb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 118 (LWP 30240):
#0  0x00007f92bd40ffb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000056022ca11b70 in ?? ()
#5  0x00007f9276ec5680 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 30239):
#0  0x00007f92bd40ffb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000009 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000056022cd1c690 in ?? ()
#5  0x00007f92776c6550 in ?? ()
#6  0x0000000000000012 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 116 (LWP 30238):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 115 (LWP 30237):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 114 (LWP 30236):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 30235):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 30234):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 30233):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 30232):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 30231):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 30230):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 30229):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 30228):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 30227):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 30226):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 30225):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 30224):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 30223):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 30222):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000006 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000056022cd19db8 in ?? ()
#4  0x00007f927fed75c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f927fed75e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 99 (LWP 30221):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 30220):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000056022cd19d38 in ?? ()
#4  0x00007f9280ed95c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f9280ed95e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 97 (LWP 30219):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 30218):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 30217):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 30216):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 30215):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 30214):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 30213):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 30212):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 30211):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 30210):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 30209):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 30208):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 30207):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 30206):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 30205):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 30204):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 30203):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 30202):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 30201):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 30200):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 30199):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 76 (LWP 30198):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 75 (LWP 30197):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 74 (LWP 30196):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 30195):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000056022cd19b38 in ?? ()
#4  0x00007f928d6f25c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f928d6f25e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 72 (LWP 30194):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 30193):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 30192):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 30191):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 30190):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 30189):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 30188):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 30187):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 30186):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 30185):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 30184):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 30183):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 30182):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 30181):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 30180):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 30179):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 56 (LWP 30178):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 55 (LWP 30177):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 30176):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 30175):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 30174):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 30173):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 30172):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 30171):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 30170):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 30169):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 30168):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 30167):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 30166):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 30165):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 30164):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 30163):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 30162):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 30161):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 30160):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000056022cd19a38 in ?? ()
#4  0x00007f929ef155c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f929ef155e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 37 (LWP 30159):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 36 (LWP 30158):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 35 (LWP 30157):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 34 (LWP 30156):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 33 (LWP 30155):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 32 (LWP 30154):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 31 (LWP 30153):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 30 (LWP 30152):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 30151):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 30150):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 30149):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 30148):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 30147):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 30146):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 30145):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 30144):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 30143):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 30142):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 19 (LWP 30141):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 18 (LWP 30140):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x000056022cd19eb8 in ?? ()
#4  0x00007f92a8f295c0 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f92a8f295e0 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 17 (LWP 30139):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 16 (LWP 30138):
#0  0x00007f92bd40ffb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 15 (LWP 30137):
#0  0x00007f92bd40ffb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000056022c9f76c8 in ?? ()
#5  0x00007f92aa72c6a0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 30136):
#0  0x00007f92bd40ffb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 13 (LWP 30135):
#0  0x00007f92bd40fad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 30134):
#0  0x00007f92bb531a47 in ?? ()
#1  0x00007f92abf2f680 in ?? ()
#2  0x00007f92b6835571 in ?? ()
#3  0x00000000000002df in ?? ()
#4  0x000056022caef398 in ?? ()
#5  0x00007f92abf2f6c0 in ?? ()
#6  0x00007f92abf2f840 in ?? ()
#7  0x000056022cb94670 in ?? ()
#8  0x00007f92b683725d in ?? ()
#9  0x3fb957e31e71c000 in ?? ()
#10 0x000056022cae0c00 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000056022cae0c00 in ?? ()
#13 0x000000002caef398 in ?? ()
#14 0x0000560200000000 in ?? ()
#15 0x41da4f1fed70f99e in ?? ()
#16 0x000056022cb94670 in ?? ()
#17 0x00007f92abf2f720 in ?? ()
#18 0x00007f92b683bba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb957e31e71c000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 11 (LWP 30133):
#0  0x00007f92bb531a47 in ?? ()
#1  0x00007f92ac730680 in ?? ()
#2  0x00007f92b6835571 in ?? ()
#3  0x00000000000002df in ?? ()
#4  0x000056022caef018 in ?? ()
#5  0x00007f92ac7306c0 in ?? ()
#6  0x00007f92ac730840 in ?? ()
#7  0x000056022cb94670 in ?? ()
#8  0x00007f92b683725d in ?? ()
#9  0x3fb98d125dd80000 in ?? ()
#10 0x000056022cadf600 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000056022cadf600 in ?? ()
#13 0x000000002caef018 in ?? ()
#14 0x0000560200000000 in ?? ()
#15 0x41da4f1fed70f99e in ?? ()
#16 0x000056022cb94670 in ?? ()
#17 0x00007f92ac730720 in ?? ()
#18 0x00007f92b683bba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb98d125dd80000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 10 (LWP 30132):
#0  0x00007f92bb531a47 in ?? ()
#1  0x00007f92acf31680 in ?? ()
#2  0x00007f92b6835571 in ?? ()
#3  0x00000000000002df in ?? ()
#4  0x000056022caef558 in ?? ()
#5  0x00007f92acf316c0 in ?? ()
#6  0x00007f92acf31840 in ?? ()
#7  0x000056022cb94670 in ?? ()
#8  0x00007f92b683725d in ?? ()
#9  0x3fb98b3382e30000 in ?? ()
#10 0x000056022cadfb80 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000056022cadfb80 in ?? ()
#13 0x000000002caef558 in ?? ()
#14 0x0000560200000000 in ?? ()
#15 0x41da4f1fed70f99d in ?? ()
#16 0x000056022cb94670 in ?? ()
#17 0x00007f92acf31720 in ?? ()
#18 0x00007f92b683bba3 in ?? ()
#19 0x33b89289396ec936 in ?? ()
#20 0x3fb98b3382e30000 in ?? ()
#21 0x0000000000000000 in ?? ()

Thread 9 (LWP 30131):
#0  0x00007f92bb531a47 in ?? ()
#1  0x00007f92aeb13680 in ?? ()
#2  0x00007f92b6835571 in ?? ()
#3  0x00000000000002df in ?? ()
#4  0x000056022caeee58 in ?? ()
#5  0x00007f92aeb136c0 in ?? ()
#6  0x00007f92aeb13840 in ?? ()
#7  0x000056022cb94670 in ?? ()
#8  0x00007f92b683725d in ?? ()
#9  0x3fb95adf9f838000 in ?? ()
#10 0x000056022cae0680 in ?? ()
#11 0x42a2309ce5400000 in ?? ()
#12 0x000056022cae0680 in ?? ()
#13 0x000000002caeee58 in ?? ()
#14 0x0000560200000000 in ?? ()
#15 0x41da4f1fed70f99d in ?? ()
#16 0x000056022cb94670 in ?? ()
#17 0x00007f92aeb13720 in ?? ()
#18 0x00007f92b683bba3 in ?? ()
#19 0x0000000000000000 in ?? ()

Thread 8 (LWP 30128):
#0  0x00007f92bb524cb9 in ?? ()
#1  0x00007f92b0316840 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 7 (LWP 30127):
#0  0x00007f92bd40ffb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000000 in ?? ()

Thread 6 (LWP 30126):
#0  0x00007f92bd4139e2 in ?? ()
#1  0x000056022ca11ee0 in ?? ()
#2  0x00007f92af3144d0 in ?? ()
#3  0x00007f92af314450 in ?? ()
#4  0x00007f92af314570 in ?? ()
#5  0x00007f92af314790 in ?? ()
#6  0x00007f92af3147a0 in ?? ()
#7  0x00007f92af3144e0 in ?? ()
#8  0x00007f92af3144d0 in ?? ()
#9  0x000056022ca10350 in ?? ()
#10 0x00007f92bd7fec6f in ?? ()
#11 0x3ff0000000000000 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 5 (LWP 30120):
#0  0x00007f92bd40ffb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000030 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000056022cb96dc8 in ?? ()
#5  0x00007f92b1318430 in ?? ()
#6  0x0000000000000060 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 4 (LWP 30119):
#0  0x00007f92bd40ffb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000056022c9f6848 in ?? ()
#5  0x00007f92b1b19790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 3 (LWP 30118):
#0  0x00007f92bd40ffb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000056022c9f62a8 in ?? ()
#5  0x00007f92b231a790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 2 (LWP 30117):
#0  0x00007f92bd40ffb9 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000000000002 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x000056022c9f6188 in ?? ()
#5  0x00007f92b2b1b790 in ?? ()
#6  0x0000000000000004 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 1 (LWP 30114):
#0  0x00007f92bd413d50 in ?? ()
#1  0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20251212 21:11:36.280943 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 29981
I20251212 21:11:36.291064 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 29846
I20251212 21:11:36.302075 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 30247
I20251212 21:11:36.312151 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 30114
I20251212 21:11:36.317694 23994 external_mini_cluster.cc:1658] Killing /tmp/dist-test-taskAaNqbA/build/release/bin/kudu with pid 25090
2025-12-12T21:11:36Z chronyd exiting
I20251212 21:11:36.334146 23994 test_util.cc:183] -----------------------------------------------
I20251212 21:11:36.334224 23994 test_util.cc:184] Had failures, leaving test files at /tmp/dist-test-taskAaNqbA/test-tmp/maintenance_mode-itest.0.RollingRestartArgs_RollingRestartITest.TestWorkloads_4.1765573831632761-23994-0
[  FAILED  ] RollingRestartArgs/RollingRestartITest.TestWorkloads/4, where GetParam() = 800-byte object <01-00 00-00 01-00 00-00 04-00 00-00 00-00 00-00 20-A0 B1-75 53-56 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 40-A0 B1-75 53-56 00-00 00-00 00-00 00-00 00-00 ... 00-00 00-00 00-00 00-00 F8-A2 B1-75 53-56 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-01 00-00 00-00 00-00 04-00 00-00 03-00 00-00 01-00 00-00 00-00 00-00> (49474 ms)
[----------] 1 test from RollingRestartArgs/RollingRestartITest (49474 ms total)

[----------] Global test environment tear-down
[==========] 2 tests from 2 test suites ran. (64697 ms total)
[  PASSED  ] 1 test.
[  FAILED  ] 1 test, listed below:
[  FAILED  ] RollingRestartArgs/RollingRestartITest.TestWorkloads/4, where GetParam() = 800-byte object <01-00 00-00 01-00 00-00 04-00 00-00 00-00 00-00 20-A0 B1-75 53-56 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 40-A0 B1-75 53-56 00-00 00-00 00-00 00-00 00-00 ... 00-00 00-00 00-00 00-00 F8-A2 B1-75 53-56 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-00 00-01 00-00 00-00 00-00 04-00 00-00 03-00 00-00 01-00 00-00 00-00 00-00>

 1 FAILED TEST
I20251212 21:11:36.334712 23994 logging.cc:424] LogThrottler /home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/client/meta_cache.cc:302: suppressed but not reported on 16 messages since previous log ~10 seconds ago