Diagnosed failure

TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate: /home/jenkins-slave/workspace/build_and_test_flaky@2/src/kudu/integration-tests/tablet_copy-itest.cc:2151: Failure
Failed
Bad status: Timed out: Timed out waiting for number of WAL segments on tablet 50dded91450341c1bd9eeeb411f6be6c on TS 0 to be 6. Found 5
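
The assertion at tablet_copy-itest.cc:2151 is a deadline-bounded poll: the test repeatedly counts the WAL segments for the tablet on TS 0 and fails with the "Timed out" status above once the deadline passes before the count reaches the expected value (here it expected 6 segments but only ever saw 5). The following is a minimal sketch of that style of wait loop, not the actual Kudu test code; the segment counter is passed in as a callable (a stand-in for inspecting the tablet server's WAL directory), and the simplified Status type here is illustrative only.

// Sketch only (not the actual Kudu test code): a deadline-bounded poll of the
// kind that produces the "Timed out waiting for number of WAL segments ...
// Found N" status above.
#include <chrono>
#include <functional>
#include <string>
#include <thread>

struct Status {
  bool ok;
  std::string message;
  static Status OK() { return {true, ""}; }
  static Status TimedOut(std::string msg) { return {false, "Timed out: " + std::move(msg)}; }
};

Status WaitForNumWalSegments(const std::function<int()>& count_wal_segments,
                             int expected,
                             const std::string& tablet_id,
                             int ts_idx,
                             std::chrono::seconds timeout) {
  const auto deadline = std::chrono::steady_clock::now() + timeout;
  int found = 0;
  while (std::chrono::steady_clock::now() < deadline) {
    found = count_wal_segments();
    if (found >= expected) {
      return Status::OK();
    }
    // Poll at a short interval instead of busy-looping until the deadline.
    std::this_thread::sleep_for(std::chrono::milliseconds(10));
  }
  return Status::TimedOut(
      "Timed out waiting for number of WAL segments on tablet " + tablet_id +
      " on TS " + std::to_string(ts_idx) + " to be " + std::to_string(expected) +
      ". Found " + std::to_string(found));
}

A loop like this surfaces exactly the message seen above when WAL segment rolling on TS 0 is slower than the test's deadline allows.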
I20250626 01:59:00.045675 10490 external_mini_cluster-itest-base.cc:80] Found fatal failure
I20250626 01:59:00.046149 10490 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 0 with UUID 37c2019541f54a949ff92c6e59946308 and pid 11257
************************ BEGIN STACKS **************************
[New LWP 11258]
[New LWP 11259]
[New LWP 11260]
[New LWP 11261]
[New LWP 11262]
[New LWP 11269]
[New LWP 11270]
[New LWP 11271]
[New LWP 11274]
[New LWP 11275]
[New LWP 11276]
[New LWP 11277]
[New LWP 11278]
[New LWP 11279]
[New LWP 11280]
[New LWP 11281]
[New LWP 11282]
[New LWP 11283]
[New LWP 11284]
[New LWP 11285]
[New LWP 11286]
[New LWP 11287]
[New LWP 11288]
[New LWP 11289]
[New LWP 11290]
[New LWP 11291]
[New LWP 11292]
[New LWP 11293]
[New LWP 11294]
[New LWP 11295]
[New LWP 11296]
[New LWP 11297]
[New LWP 11298]
[New LWP 11299]
[New LWP 11300]
[New LWP 11301]
[New LWP 11302]
[New LWP 11303]
[New LWP 11304]
[New LWP 11305]
[New LWP 11306]
[New LWP 11307]
[New LWP 11308]
[New LWP 11309]
[New LWP 11310]
[New LWP 11311]
[New LWP 11312]
[New LWP 11313]
[New LWP 11314]
[New LWP 11315]
[New LWP 11316]
[New LWP 11317]
[New LWP 11318]
[New LWP 11319]
[New LWP 11320]
[New LWP 11321]
[New LWP 11322]
[New LWP 11323]
[New LWP 11324]
[New LWP 11325]
[New LWP 11326]
[New LWP 11327]
[New LWP 11328]
[New LWP 11329]
[New LWP 11330]
[New LWP 11331]
[New LWP 11332]
[New LWP 11333]
[New LWP 11334]
[New LWP 11335]
[New LWP 11336]
[New LWP 11337]
[New LWP 11338]
[New LWP 11339]
[New LWP 11340]
[New LWP 11341]
[New LWP 11342]
[New LWP 11343]
[New LWP 11344]
[New LWP 11345]
[New LWP 11346]
[New LWP 11347]
[New LWP 11348]
[New LWP 11349]
[New LWP 11350]
[New LWP 11351]
[New LWP 11352]
[New LWP 11353]
[New LWP 11354]
[New LWP 11355]
[New LWP 11356]
[New LWP 11357]
[New LWP 11358]
[New LWP 11359]
[New LWP 11360]
[New LWP 11361]
[New LWP 11362]
[New LWP 11363]
[New LWP 11364]
[New LWP 11365]
[New LWP 11366]
[New LWP 11367]
[New LWP 11368]
[New LWP 11369]
[New LWP 11370]
[New LWP 11371]
[New LWP 11372]
[New LWP 11373]
[New LWP 11374]
[New LWP 11375]
[New LWP 11376]
[New LWP 11377]
[New LWP 11378]
[New LWP 11379]
[New LWP 11380]
[New LWP 11381]
[New LWP 11382]
[New LWP 11383]
[New LWP 11384]
[New LWP 11385]
[New LWP 11386]
[New LWP 11387]
[New LWP 11864]
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c4802038e
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c4802038e
0x00007f267212dd50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 11257 "kudu"  0x00007f267212dd50 in ?? ()
  2    LWP 11258 "kudu"  0x00007f266d4f17a0 in ?? ()
  3    LWP 11259 "kudu"  0x00007f2672129fb9 in ?? ()
  4    LWP 11260 "kudu"  0x00007f2672129fb9 in ?? ()
  5    LWP 11261 "kudu"  0x00007f2672129fb9 in ?? ()
  6    LWP 11262 "kernel-watcher-" 0x00007f2672129fb9 in ?? ()
  7    LWP 11269 "ntp client-1126" 0x00007f267212d9e2 in ?? ()
  8    LWP 11270 "file cache-evic" 0x00007f2672129fb9 in ?? ()
  9    LWP 11271 "sq_acceptor" 0x00007f266d521cb9 in ?? ()
  10   LWP 11274 "rpc reactor-112" 0x00007f266d52ea47 in ?? ()
  11   LWP 11275 "rpc reactor-112" 0x00007f266d52ea47 in ?? ()
  12   LWP 11276 "rpc reactor-112" 0x00007f266d52ea47 in ?? ()
  13   LWP 11277 "rpc reactor-112" 0x00007f266d52ea47 in ?? ()
  14   LWP 11278 "MaintenanceMgr " 0x00007f2672129ad3 in ?? ()
  15   LWP 11279 "txn-status-mana" 0x00007f2672129fb9 in ?? ()
  16   LWP 11280 "collect_and_rem" 0x00007f2672129fb9 in ?? ()
  17   LWP 11281 "tc-session-exp-" 0x00007f2672129fb9 in ?? ()
  18   LWP 11282 "rpc worker-1128" 0x00007f2672129ad3 in ?? ()
  19   LWP 11283 "rpc worker-1128" 0x00007f2672129ad3 in ?? ()
  20   LWP 11284 "rpc worker-1128" 0x00007f2672129ad3 in ?? ()
  21   LWP 11285 "rpc worker-1128" 0x00007f2672129ad3 in ?? ()
  22   LWP 11286 "rpc worker-1128" 0x00007f2672129ad3 in ?? ()
  23   LWP 11287 "rpc worker-1128" 0x00007f2672129ad3 in ?? ()
  24   LWP 11288 "rpc worker-1128" 0x00007f2672129ad3 in ?? ()
  25   LWP 11289 "rpc worker-1128" 0x00007f2672129ad3 in ?? ()
  26   LWP 11290 "rpc worker-1129" 0x00007f2672129ad3 in ?? ()
  27   LWP 11291 "rpc worker-1129" 0x00007f2672129ad3 in ?? ()
  28   LWP 11292 "rpc worker-1129" 0x00007f2672129ad3 in ?? ()
  29   LWP 11293 "rpc worker-1129" 0x00007f2672129ad3 in ?? ()
  30   LWP 11294 "rpc worker-1129" 0x00007f2672129ad3 in ?? ()
  31   LWP 11295 "rpc worker-1129" 0x00007f2672129ad3 in ?? ()
  32   LWP 11296 "rpc worker-1129" 0x00007f2672129ad3 in ?? ()
  33   LWP 11297 "rpc worker-1129" 0x00007f2672129ad3 in ?? ()
  34   LWP 11298 "rpc worker-1129" 0x00007f2672129ad3 in ?? ()
  35   LWP 11299 "rpc worker-1129" 0x00007f2672129ad3 in ?? ()
  36   LWP 11300 "rpc worker-1130" 0x00007f2672129ad3 in ?? ()
  37   LWP 11301 "rpc worker-1130" 0x00007f2672129ad3 in ?? ()
  38   LWP 11302 "rpc worker-1130" 0x00007f2672129ad3 in ?? ()
  39   LWP 11303 "rpc worker-1130" 0x00007f2672129ad3 in ?? ()
  40   LWP 11304 "rpc worker-1130" 0x00007f2672129ad3 in ?? ()
  41   LWP 11305 "rpc worker-1130" 0x00007f2672129ad3 in ?? ()
  42   LWP 11306 "rpc worker-1130" 0x00007f2672129ad3 in ?? ()
  43   LWP 11307 "rpc worker-1130" 0x00007f2672129ad3 in ?? ()
  44   LWP 11308 "rpc worker-1130" 0x00007f2672129ad3 in ?? ()
  45   LWP 11309 "rpc worker-1130" 0x00007f2672129ad3 in ?? ()
  46   LWP 11310 "rpc worker-1131" 0x00007f2672129ad3 in ?? ()
  47   LWP 11311 "rpc worker-1131" 0x00007f2672129ad3 in ?? ()
  48   LWP 11312 "rpc worker-1131" 0x00007f2672129ad3 in ?? ()
  49   LWP 11313 "rpc worker-1131" 0x00007f2672129ad3 in ?? ()
  50   LWP 11314 "rpc worker-1131" 0x00007f2672129ad3 in ?? ()
  51   LWP 11315 "rpc worker-1131" 0x00007f2672129ad3 in ?? ()
  52   LWP 11316 "rpc worker-1131" 0x00007f2672129ad3 in ?? ()
  53   LWP 11317 "rpc worker-1131" 0x00007f2672129ad3 in ?? ()
  54   LWP 11318 "rpc worker-1131" 0x00007f2672129ad3 in ?? ()
  55   LWP 11319 "rpc worker-1131" 0x00007f2672129ad3 in ?? ()
  56   LWP 11320 "rpc worker-1132" 0x00007f2672129ad3 in ?? ()
  57   LWP 11321 "rpc worker-1132" 0x00007f2672129ad3 in ?? ()
  58   LWP 11322 "rpc worker-1132" 0x00007f2672129ad3 in ?? ()
  59   LWP 11323 "rpc worker-1132" 0x00007f2672129ad3 in ?? ()
  60   LWP 11324 "rpc worker-1132" 0x00007f2672129ad3 in ?? ()
  61   LWP 11325 "rpc worker-1132" 0x00007f2672129ad3 in ?? ()
  62   LWP 11326 "rpc worker-1132" 0x00007f2672129ad3 in ?? ()
  63   LWP 11327 "rpc worker-1132" 0x00007f2672129ad3 in ?? ()
  64   LWP 11328 "rpc worker-1132" 0x00007f2672129ad3 in ?? ()
  65   LWP 11329 "rpc worker-1132" 0x00007f2672129ad3 in ?? ()
  66   LWP 11330 "rpc worker-1133" 0x00007f2672129ad3 in ?? ()
  67   LWP 11331 "rpc worker-1133" 0x00007f2672129ad3 in ?? ()
  68   LWP 11332 "rpc worker-1133" 0x00007f2672129ad3 in ?? ()
  69   LWP 11333 "rpc worker-1133" 0x00007f2672129ad3 in ?? ()
  70   LWP 11334 "rpc worker-1133" 0x00007f2672129ad3 in ?? ()
  71   LWP 11335 "rpc worker-1133" 0x00007f2672129ad3 in ?? ()
  72   LWP 11336 "rpc worker-1133" 0x00007f2672129ad3 in ?? ()
  73   LWP 11337 "rpc worker-1133" 0x00007f2672129ad3 in ?? ()
  74   LWP 11338 "rpc worker-1133" 0x00007f2672129ad3 in ?? ()
  75   LWP 11339 "rpc worker-1133" 0x00007f2672129ad3 in ?? ()
  76   LWP 11340 "rpc worker-1134" 0x00007f2672129ad3 in ?? ()
  77   LWP 11341 "rpc worker-1134" 0x00007f2672129ad3 in ?? ()
  78   LWP 11342 "rpc worker-1134" 0x00007f2672129ad3 in ?? ()
  79   LWP 11343 "rpc worker-1134" 0x00007f2672129ad3 in ?? ()
  80   LWP 11344 "rpc worker-1134" 0x00007f2672129ad3 in ?? ()
  81   LWP 11345 "rpc worker-1134" 0x00007f2672129ad3 in ?? ()
  82   LWP 11346 "rpc worker-1134" 0x00007f2672129ad3 in ?? ()
  83   LWP 11347 "rpc worker-1134" 0x00007f2672129ad3 in ?? ()
  84   LWP 11348 "rpc worker-1134" 0x00007f2672129ad3 in ?? ()
  85   LWP 11349 "rpc worker-1134" 0x00007f2672129ad3 in ?? ()
  86   LWP 11350 "rpc worker-1135" 0x00007f2672129ad3 in ?? ()
  87   LWP 11351 "rpc worker-1135" 0x00007f2672129ad3 in ?? ()
  88   LWP 11352 "rpc worker-1135" 0x00007f2672129ad3 in ?? ()
  89   LWP 11353 "rpc worker-1135" 0x00007f2672129ad3 in ?? ()
  90   LWP 11354 "rpc worker-1135" 0x00007f2672129ad3 in ?? ()
  91   LWP 11355 "rpc worker-1135" 0x00007f2672129ad3 in ?? ()
  92   LWP 11356 "rpc worker-1135" 0x00007f2672129ad3 in ?? ()
  93   LWP 11357 "rpc worker-1135" 0x00007f2672129ad3 in ?? ()
  94   LWP 11358 "rpc worker-1135" 0x00007f2672129ad3 in ?? ()
  95   LWP 11359 "rpc worker-1135" 0x00007f2672129ad3 in ?? ()
  96   LWP 11360 "rpc worker-1136" 0x00007f2672129ad3 in ?? ()
  97   LWP 11361 "rpc worker-1136" 0x00007f2672129ad3 in ?? ()
  98   LWP 11362 "rpc worker-1136" 0x00007f2672129ad3 in ?? ()
  99   LWP 11363 "rpc worker-1136" 0x00007f2672129ad3 in ?? ()
  100  LWP 11364 "rpc worker-1136" 0x00007f2672129ad3 in ?? ()
  101  LWP 11365 "rpc worker-1136" 0x00007f2672129ad3 in ?? ()
  102  LWP 11366 "rpc worker-1136" 0x00007f2672129ad3 in ?? ()
  103  LWP 11367 "rpc worker-1136" 0x00007f2672129ad3 in ?? ()
  104  LWP 11368 "rpc worker-1136" 0x00007f2672129ad3 in ?? ()
  105  LWP 11369 "rpc worker-1136" 0x00007f2672129ad3 in ?? ()
  106  LWP 11370 "rpc worker-1137" 0x00007f2672129ad3 in ?? ()
  107  LWP 11371 "rpc worker-1137" 0x00007f2672129ad3 in ?? ()
  108  LWP 11372 "rpc worker-1137" 0x00007f2672129ad3 in ?? ()
  109  LWP 11373 "rpc worker-1137" 0x00007f2672129ad3 in ?? ()
  110  LWP 11374 "rpc worker-1137" 0x00007f2672129ad3 in ?? ()
  111  LWP 11375 "rpc worker-1137" 0x00007f2672129ad3 in ?? ()
  112  LWP 11376 "rpc worker-1137" 0x00007f2672129ad3 in ?? ()
  113  LWP 11377 "rpc worker-1137" 0x00007f2672129ad3 in ?? ()
  114  LWP 11378 "rpc worker-1137" 0x00007f2672129ad3 in ?? ()
  115  LWP 11379 "rpc worker-1137" 0x00007f2672129ad3 in ?? ()
  116  LWP 11380 "rpc worker-1138" 0x00007f2672129ad3 in ?? ()
  117  LWP 11381 "rpc worker-1138" 0x00007f2672129ad3 in ?? ()
  118  LWP 11382 "diag-logger-113" 0x00007f2672129fb9 in ?? ()
  119  LWP 11383 "result-tracker-" 0x00007f2672129fb9 in ?? ()
  120  LWP 11384 "excess-log-dele" 0x00007f2672129fb9 in ?? ()
  121  LWP 11385 "acceptor-11385" 0x00007f266d5300c7 in ?? ()
  122  LWP 11386 "heartbeat-11386" 0x00007f2672129fb9 in ?? ()
  123  LWP 11387 "maintenance_sch" 0x00007f2672129fb9 in ?? ()
  124  LWP 11864 "raft [worker]-1" 0x00007f2672129fb9 in ?? ()

Thread 124 (LWP 11864):
#0  0x00007f2672129fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 123 (LWP 11387):
#0  0x00007f2672129fb9 in ?? ()
#1  0x00007f0100000000 in ?? ()
#2  0x000000000000010a in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b54000028f0 in ?? ()
#5  0x00007f26263b96c0 in ?? ()
#6  0x0000000000000214 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 11386):
#0  0x00007f2672129fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 121 (LWP 11385):
#0  0x00007f266d5300c7 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 120 (LWP 11384):
#0  0x00007f2672129fb9 in ?? ()
#1  0x00007f2627bbc940 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffcc319ba00 in ?? ()
#5  0x00007f2627bbc7b0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 11383):
#0  0x00007f2672129fb9 in ?? ()
#1  0x0000000085352fb8 in ?? ()
#2  0x0000000000000042 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b3400001008 in ?? ()
#5  0x00007f26283bd800 in ?? ()
#6  0x0000000000000084 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 118 (LWP 11382):
#0  0x00007f2672129fb9 in ?? ()
#1  0x00007f266b488008 in ?? ()
#2  0x000000000000003e in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4000000c90 in ?? ()
#5  0x00007f2628bbe750 in ?? ()
#6  0x000000000000007c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 11381):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 116 (LWP 11380):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 115 (LWP 11379):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 114 (LWP 11378):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 11377):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 11376):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 11375):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 11374):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 11373):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 11372):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 11371):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 11370):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 11369):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 11368):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 11367):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 11366):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 11365):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 11364):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 11363):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 11362):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 97 (LWP 11361):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 11360):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 11359):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 11358):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 11357):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 11356):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 11355):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 11354):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 11353):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 11352):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 11351):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 11350):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 11349):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 11348):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 11347):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 11346):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 11345):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 11344):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 11343):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 11342):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 11341):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00007b24001147c8 in ?? ()
#4  0x00007f263ddba710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f263ddba730 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 76 (LWP 11340):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 75 (LWP 11339):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 74 (LWP 11338):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 11337):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 11336):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 11335):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 11334):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 11333):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 11332):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 11331):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 11330):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 11329):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 11328):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 11327):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 11326):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 11325):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 11324):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 11323):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 11322):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 11321):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b24000b902c in ?? ()
#4  0x00007f26481bc710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f26481bc730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x007f0400000026c8 in ?? ()
#9  0x00007f2672129770 in ?? ()
#10 0x00007f26481bc730 in ?? ()
#11 0x0002008300000dfe in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 56 (LWP 11320):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 55 (LWP 11319):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 11318):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 11317):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 11316):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 11315):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 11314):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 11313):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 11312):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 11311):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 11310):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 11309):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 11308):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 11307):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 11306):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 11305):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 11304):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 11303):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 11302):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 11301):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000167 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240005ffec in ?? ()
#4  0x00007f26525be710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f26525be730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000000000045e4c9 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007f2672129770 in ?? ()
#10 0x00007f26525be730 in ?? ()
#11 0x00007f26321adca0 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 36 (LWP 11300):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000290 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00007b240005d7f8 in ?? ()
#4  0x00007f2652fb6710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f2652fb6730 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 35 (LWP 11299):
#0  0x00007f2672129ad3 in ?? ()
#1  0x00000000000003dd in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b2400058ffc in ?? ()
#4  0x00007f26537b7710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f26537b7730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000000000045e4c9 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007f2672129770 in ?? ()
#10 0x00007f26537b7730 in ?? ()
#11 0x00007f263217eaa0 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 34 (LWP 11298):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 33 (LWP 11297):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 32 (LWP 11296):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 31 (LWP 11295):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 30 (LWP 11294):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 11293):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 11292):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 11291):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 11290):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 11289):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 11288):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 11287):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 11286):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 11285):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 11284):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 19 (LWP 11283):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 18 (LWP 11282):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 17 (LWP 11281):
#0  0x00007f2672129fb9 in ?? ()
#1  0x0000000017a335f0 in ?? ()
#2  0x0000000000000006 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4800003a00 in ?? ()
#5  0x00007f265cb92700 in ?? ()
#6  0x000000000000000c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 16 (LWP 11280):
#0  0x00007f2672129fb9 in ?? ()
#1  0x00007f265d3939a8 in ?? ()
#2  0x000000000000000d in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4400037198 in ?? ()
#5  0x00007f265d393840 in ?? ()
#6  0x000000000000001a in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 15 (LWP 11279):
#0  0x00007f2672129fb9 in ?? ()
#1  0x0000000000000018 in ?? ()
#2  0x0000000000000006 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b5800000118 in ?? ()
#5  0x00007f265db94410 in ?? ()
#6  0x000000000000000c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 11278):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 13 (LWP 11277):
#0  0x00007f266d52ea47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 11276):
#0  0x00007f266d52ea47 in ?? ()
#1  0x00007b280002a128 in ?? ()
#2  0x0044e000029c2742 in ?? ()
#3  0x00007f265f397500 in ?? ()
#4  0x00007f265f398b80 in ?? ()
#5  0x00007f265f397500 in ?? ()
#6  0x0000000000000011 in ?? ()
#7  0x00007b5800001800 in ?? ()
#8  0x0000000000488695 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007f266aca2000 in ?? ()
#10 0x0000000000488599 in __sanitizer::internal_alloc_placeholder ()
#11 0x00007f265f398b80 in ?? ()
#12 0x00007f266ff87069 in ?? ()
#13 0x00007b4c00000000 in ?? ()
#14 0x00007f26757081a0 in ?? ()
#15 0x00007b4c00002f90 in ?? ()
#16 0x00007b4c00002f98 in ?? ()
#17 0x00007f265f3977a0 in ?? ()
#18 0x00007b4400033d00 in ?? ()
#19 0x00007f265f397cd0 in ?? ()
#20 0x0000000000000000 in ?? ()

Thread 11 (LWP 11275):
#0  0x00007f266d52ea47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 10 (LWP 11274):
#0  0x00007f266d52ea47 in ?? ()
#1  0x00007b5800010408 in ?? ()
#2  0x003ce00001c280a3 in ?? ()
#3  0x00007f2662bbe500 in ?? ()
#4  0x00007f2662bbfb80 in ?? ()
#5  0x00007f2662bbe500 in ?? ()
#6  0x000000000000000d in ?? ()
#7  0x00007b5800000f00 in ?? ()
#8  0x0000000000488695 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007f266acc6000 in ?? ()
#10 0x0000000000488599 in __sanitizer::internal_alloc_placeholder ()
#11 0x00007f2662bbfb80 in ?? ()
#12 0x00007f266ff87069 in ?? ()
#13 0x00007b4c00000000 in ?? ()
#14 0x00007f26757081a0 in ?? ()
#15 0x00007b4c00002c10 in ?? ()
#16 0x00007b4c00002c18 in ?? ()
#17 0x00007f2662bbe7a0 in ?? ()
#18 0x00007b4400036a00 in ?? ()
#19 0x00007f2662bbecd0 in ?? ()
#20 0x0000000000000000 in ?? ()

Thread 9 (LWP 11271):
#0  0x00007f266d521cb9 in ?? ()
#1  0x00007f26663bcc10 in ?? ()
#2  0x00007b040000a860 in ?? ()
#3  0x00007f26663bdb80 in ?? ()
#4  0x00007f26663bcc10 in ?? ()
#5  0x00007b040000a860 in ?? ()
#6  0x00000000004888a3 in __sanitizer::internal_alloc_placeholder ()
#7  0x00007f266ae5a000 in ?? ()
#8  0x0100000000000001 in ?? ()
#9  0x00007f26663bdb80 in ?? ()
#10 0x00007f2676f06b28 in ?? ()
#11 0x0000000000000000 in ?? ()

Thread 8 (LWP 11270):
#0  0x00007f2672129fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 7 (LWP 11269):
#0  0x00007f267212d9e2 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 6 (LWP 11262):
#0  0x00007f2672129fb9 in ?? ()
#1  0x00007f26673bea40 in ?? ()
#2  0x0000000000000154 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b44000361d8 in ?? ()
#5  0x00007f26673be5d0 in ?? ()
#6  0x00000000000002a8 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 5 (LWP 11261):
#0  0x00007f2672129fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 4 (LWP 11260):
#0  0x00007f2672129fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 3 (LWP 11259):
#0  0x00007f2672129fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 2 (LWP 11258):
#0  0x00007f266d4f17a0 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 1 (LWP 11257):
#0  0x00007f267212dd50 in ?? ()
#1  0x00007ffcc319b870 in ?? ()
#2  0x0000000000467b2b in __sanitizer::internal_alloc_placeholder ()
#3  0x00007f266c74fcc0 in ?? ()
#4  0x00007f266c74fcc0 in ?? ()
#5  0x00007ffcc319b810 in ?? ()
#6  0x000000000048aef4 in __sanitizer::internal_alloc_placeholder ()
#7  0x0000600001000078 in ?? ()
#8  0xffffffff00a96d0f in ?? ()
#9  0x00007f266c74fcc0 in ?? ()
#10 0x00007f2670655f0b in ?? ()
#11 0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20250626 01:59:01.247025 10490 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 1 with UUID f990919f780a49e39097b5e2cd933557 and pid 11390
************************ BEGIN STACKS **************************
[New LWP 11391]
[New LWP 11392]
[New LWP 11393]
[New LWP 11394]
[New LWP 11395]
[New LWP 11403]
[New LWP 11404]
[New LWP 11405]
[New LWP 11408]
[New LWP 11409]
[New LWP 11410]
[New LWP 11411]
[New LWP 11412]
[New LWP 11413]
[New LWP 11414]
[New LWP 11415]
[New LWP 11416]
[New LWP 11417]
[New LWP 11418]
[New LWP 11419]
[New LWP 11420]
[New LWP 11421]
[New LWP 11422]
[New LWP 11423]
[New LWP 11424]
[New LWP 11425]
[New LWP 11426]
[New LWP 11427]
[New LWP 11428]
[New LWP 11429]
[New LWP 11430]
[New LWP 11431]
[New LWP 11432]
[New LWP 11433]
[New LWP 11434]
[New LWP 11435]
[New LWP 11436]
[New LWP 11437]
[New LWP 11438]
[New LWP 11439]
[New LWP 11440]
[New LWP 11441]
[New LWP 11442]
[New LWP 11443]
[New LWP 11444]
[New LWP 11445]
[New LWP 11446]
[New LWP 11447]
[New LWP 11448]
[New LWP 11449]
[New LWP 11450]
[New LWP 11451]
[New LWP 11452]
[New LWP 11453]
[New LWP 11454]
[New LWP 11455]
[New LWP 11456]
[New LWP 11457]
[New LWP 11458]
[New LWP 11459]
[New LWP 11460]
[New LWP 11461]
[New LWP 11462]
[New LWP 11463]
[New LWP 11464]
[New LWP 11465]
[New LWP 11466]
[New LWP 11467]
[New LWP 11468]
[New LWP 11469]
[New LWP 11470]
[New LWP 11471]
[New LWP 11472]
[New LWP 11473]
[New LWP 11474]
[New LWP 11475]
[New LWP 11476]
[New LWP 11477]
[New LWP 11478]
[New LWP 11479]
[New LWP 11480]
[New LWP 11481]
[New LWP 11482]
[New LWP 11483]
[New LWP 11484]
[New LWP 11485]
[New LWP 11486]
[New LWP 11487]
[New LWP 11488]
[New LWP 11489]
[New LWP 11490]
[New LWP 11491]
[New LWP 11492]
[New LWP 11493]
[New LWP 11494]
[New LWP 11495]
[New LWP 11496]
[New LWP 11497]
[New LWP 11498]
[New LWP 11499]
[New LWP 11500]
[New LWP 11501]
[New LWP 11502]
[New LWP 11503]
[New LWP 11504]
[New LWP 11505]
[New LWP 11506]
[New LWP 11507]
[New LWP 11508]
[New LWP 11509]
[New LWP 11510]
[New LWP 11511]
[New LWP 11512]
[New LWP 11513]
[New LWP 11514]
[New LWP 11515]
[New LWP 11516]
[New LWP 11517]
[New LWP 11518]
[New LWP 11519]
[New LWP 11520]
[New LWP 11521]
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c4802038e
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c4802038e
0x00007f7923b56d50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 11390 "kudu"  0x00007f7923b56d50 in ?? ()
  2    LWP 11391 "kudu"  0x00007f791ef1a7a0 in ?? ()
  3    LWP 11392 "kudu"  0x00007f7923b52fb9 in ?? ()
  4    LWP 11393 "kudu"  0x00007f7923b52fb9 in ?? ()
  5    LWP 11394 "kudu"  0x00007f7923b52fb9 in ?? ()
  6    LWP 11395 "kernel-watcher-" 0x00007f7923b52fb9 in ?? ()
  7    LWP 11403 "ntp client-1140" 0x00007f7923b569e2 in ?? ()
  8    LWP 11404 "file cache-evic" 0x00007f7923b52fb9 in ?? ()
  9    LWP 11405 "sq_acceptor" 0x00007f791ef4acb9 in ?? ()
  10   LWP 11408 "rpc reactor-114" 0x00007f791ef57a47 in ?? ()
  11   LWP 11409 "rpc reactor-114" 0x00007f791ef57a47 in ?? ()
  12   LWP 11410 "rpc reactor-114" 0x00007f791ef57a47 in ?? ()
  13   LWP 11411 "rpc reactor-114" 0x00007f791ef57a47 in ?? ()
  14   LWP 11412 "MaintenanceMgr " 0x00007f7923b52ad3 in ?? ()
  15   LWP 11413 "txn-status-mana" 0x00007f7923b52fb9 in ?? ()
  16   LWP 11414 "collect_and_rem" 0x00007f7923b52fb9 in ?? ()
  17   LWP 11415 "tc-session-exp-" 0x00007f7923b52fb9 in ?? ()
  18   LWP 11416 "rpc worker-1141" 0x00007f7923b52ad3 in ?? ()
  19   LWP 11417 "rpc worker-1141" 0x00007f7923b52ad3 in ?? ()
  20   LWP 11418 "rpc worker-1141" 0x00007f7923b52ad3 in ?? ()
  21   LWP 11419 "rpc worker-1141" 0x00007f7923b52ad3 in ?? ()
  22   LWP 11420 "rpc worker-1142" 0x00007f7923b52ad3 in ?? ()
  23   LWP 11421 "rpc worker-1142" 0x00007f7923b52ad3 in ?? ()
  24   LWP 11422 "rpc worker-1142" 0x00007f7923b52ad3 in ?? ()
  25   LWP 11423 "rpc worker-1142" 0x00007f7923b52ad3 in ?? ()
  26   LWP 11424 "rpc worker-1142" 0x00007f7923b52ad3 in ?? ()
  27   LWP 11425 "rpc worker-1142" 0x00007f7923b52ad3 in ?? ()
  28   LWP 11426 "rpc worker-1142" 0x00007f7923b52ad3 in ?? ()
  29   LWP 11427 "rpc worker-1142" 0x00007f7923b52ad3 in ?? ()
  30   LWP 11428 "rpc worker-1142" 0x00007f7923b52ad3 in ?? ()
  31   LWP 11429 "rpc worker-1142" 0x00007f7923b52ad3 in ?? ()
  32   LWP 11430 "rpc worker-1143" 0x00007f7923b52ad3 in ?? ()
  33   LWP 11431 "rpc worker-1143" 0x00007f7923b52ad3 in ?? ()
  34   LWP 11432 "rpc worker-1143" 0x00007f7923b52ad3 in ?? ()
  35   LWP 11433 "rpc worker-1143" 0x00007f7923b52ad3 in ?? ()
  36   LWP 11434 "rpc worker-1143" 0x00007f7923b52ad3 in ?? ()
  37   LWP 11435 "rpc worker-1143" 0x00007f7923b52ad3 in ?? ()
  38   LWP 11436 "rpc worker-1143" 0x00007f7923b52ad3 in ?? ()
  39   LWP 11437 "rpc worker-1143" 0x00007f7923b52ad3 in ?? ()
  40   LWP 11438 "rpc worker-1143" 0x00007f7923b52ad3 in ?? ()
  41   LWP 11439 "rpc worker-1143" 0x00007f7923b52ad3 in ?? ()
  42   LWP 11440 "rpc worker-1144" 0x00007f7923b52ad3 in ?? ()
  43   LWP 11441 "rpc worker-1144" 0x00007f7923b52ad3 in ?? ()
  44   LWP 11442 "rpc worker-1144" 0x00007f7923b52ad3 in ?? ()
  45   LWP 11443 "rpc worker-1144" 0x00007f7923b52ad3 in ?? ()
  46   LWP 11444 "rpc worker-1144" 0x00007f7923b52ad3 in ?? ()
  47   LWP 11445 "rpc worker-1144" 0x00007f7923b52ad3 in ?? ()
  48   LWP 11446 "rpc worker-1144" 0x00007f7923b52ad3 in ?? ()
  49   LWP 11447 "rpc worker-1144" 0x00007f7923b52ad3 in ?? ()
  50   LWP 11448 "rpc worker-1144" 0x00007f7923b52ad3 in ?? ()
  51   LWP 11449 "rpc worker-1144" 0x00007f7923b52ad3 in ?? ()
  52   LWP 11450 "rpc worker-1145" 0x00007f7923b52ad3 in ?? ()
  53   LWP 11451 "rpc worker-1145" 0x00007f7923b52ad3 in ?? ()
  54   LWP 11452 "rpc worker-1145" 0x00007f7923b52ad3 in ?? ()
  55   LWP 11453 "rpc worker-1145" 0x00007f7923b52ad3 in ?? ()
  56   LWP 11454 "rpc worker-1145" 0x00007f7923b52ad3 in ?? ()
  57   LWP 11455 "rpc worker-1145" 0x00007f7923b52ad3 in ?? ()
  58   LWP 11456 "rpc worker-1145" 0x00007f7923b52ad3 in ?? ()
  59   LWP 11457 "rpc worker-1145" 0x00007f7923b52ad3 in ?? ()
  60   LWP 11458 "rpc worker-1145" 0x00007f7923b52ad3 in ?? ()
  61   LWP 11459 "rpc worker-1145" 0x00007f7923b52ad3 in ?? ()
  62   LWP 11460 "rpc worker-1146" 0x00007f7923b52ad3 in ?? ()
  63   LWP 11461 "rpc worker-1146" 0x00007f7923b52ad3 in ?? ()
  64   LWP 11462 "rpc worker-1146" 0x00007f7923b52ad3 in ?? ()
  65   LWP 11463 "rpc worker-1146" 0x00007f7923b52ad3 in ?? ()
  66   LWP 11464 "rpc worker-1146" 0x00007f7923b52ad3 in ?? ()
  67   LWP 11465 "rpc worker-1146" 0x00007f7923b52ad3 in ?? ()
  68   LWP 11466 "rpc worker-1146" 0x00007f7923b52ad3 in ?? ()
  69   LWP 11467 "rpc worker-1146" 0x00007f7923b52ad3 in ?? ()
  70   LWP 11468 "rpc worker-1146" 0x00007f7923b52ad3 in ?? ()
  71   LWP 11469 "rpc worker-1146" 0x00007f7923b52ad3 in ?? ()
  72   LWP 11470 "rpc worker-1147" 0x00007f7923b52ad3 in ?? ()
  73   LWP 11471 "rpc worker-1147" 0x00007f7923b52ad3 in ?? ()
  74   LWP 11472 "rpc worker-1147" 0x00007f7923b52ad3 in ?? ()
  75   LWP 11473 "rpc worker-1147" 0x00007f7923b52ad3 in ?? ()
  76   LWP 11474 "rpc worker-1147" 0x00007f7923b52ad3 in ?? ()
  77   LWP 11475 "rpc worker-1147" 0x00007f7923b52ad3 in ?? ()
  78   LWP 11476 "rpc worker-1147" 0x00007f7923b52ad3 in ?? ()
  79   LWP 11477 "rpc worker-1147" 0x00007f7923b52ad3 in ?? ()
  80   LWP 11478 "rpc worker-1147" 0x00007f7923b52ad3 in ?? ()
  81   LWP 11479 "rpc worker-1147" 0x00007f7923b52ad3 in ?? ()
  82   LWP 11480 "rpc worker-1148" 0x00007f7923b52ad3 in ?? ()
  83   LWP 11481 "rpc worker-1148" 0x00007f7923b52ad3 in ?? ()
  84   LWP 11482 "rpc worker-1148" 0x00007f7923b52ad3 in ?? ()
  85   LWP 11483 "rpc worker-1148" 0x00007f7923b52ad3 in ?? ()
  86   LWP 11484 "rpc worker-1148" 0x00007f7923b52ad3 in ?? ()
  87   LWP 11485 "rpc worker-1148" 0x00007f7923b52ad3 in ?? ()
  88   LWP 11486 "rpc worker-1148" 0x00007f7923b52ad3 in ?? ()
  89   LWP 11487 "rpc worker-1148" 0x00007f7923b52ad3 in ?? ()
  90   LWP 11488 "rpc worker-1148" 0x00007f7923b52ad3 in ?? ()
  91   LWP 11489 "rpc worker-1148" 0x00007f7923b52ad3 in ?? ()
  92   LWP 11490 "rpc worker-1149" 0x00007f7923b52ad3 in ?? ()
  93   LWP 11491 "rpc worker-1149" 0x00007f7923b52ad3 in ?? ()
  94   LWP 11492 "rpc worker-1149" 0x00007f7923b52ad3 in ?? ()
  95   LWP 11493 "rpc worker-1149" 0x00007f7923b52ad3 in ?? ()
  96   LWP 11494 "rpc worker-1149" 0x00007f7923b52ad3 in ?? ()
  97   LWP 11495 "rpc worker-1149" 0x00007f7923b52ad3 in ?? ()
  98   LWP 11496 "rpc worker-1149" 0x00007f7923b52ad3 in ?? ()
  99   LWP 11497 "rpc worker-1149" 0x00007f7923b52ad3 in ?? ()
  100  LWP 11498 "rpc worker-1149" 0x00007f7923b52ad3 in ?? ()
  101  LWP 11499 "rpc worker-1149" 0x00007f7923b52ad3 in ?? ()
  102  LWP 11500 "rpc worker-1150" 0x00007f7923b52ad3 in ?? ()
  103  LWP 11501 "rpc worker-1150" 0x00007f7923b52ad3 in ?? ()
  104  LWP 11502 "rpc worker-1150" 0x00007f7923b52ad3 in ?? ()
  105  LWP 11503 "rpc worker-1150" 0x00007f7923b52ad3 in ?? ()
  106  LWP 11504 "rpc worker-1150" 0x00007f7923b52ad3 in ?? ()
  107  LWP 11505 "rpc worker-1150" 0x00007f7923b52ad3 in ?? ()
  108  LWP 11506 "rpc worker-1150" 0x00007f7923b52ad3 in ?? ()
  109  LWP 11507 "rpc worker-1150" 0x00007f7923b52ad3 in ?? ()
  110  LWP 11508 "rpc worker-1150" 0x00007f7923b52ad3 in ?? ()
  111  LWP 11509 "rpc worker-1150" 0x00007f7923b52ad3 in ?? ()
  112  LWP 11510 "rpc worker-1151" 0x00007f7923b52ad3 in ?? ()
  113  LWP 11511 "rpc worker-1151" 0x00007f7923b52ad3 in ?? ()
  114  LWP 11512 "rpc worker-1151" 0x00007f7923b52ad3 in ?? ()
  115  LWP 11513 "rpc worker-1151" 0x00007f7923b52ad3 in ?? ()
  116  LWP 11514 "rpc worker-1151" 0x00007f7923b52ad3 in ?? ()
  117  LWP 11515 "rpc worker-1151" 0x00007f7923b52ad3 in ?? ()
  118  LWP 11516 "diag-logger-115" 0x00007f7923b52fb9 in ?? ()
  119  LWP 11517 "result-tracker-" 0x00007f7923b52fb9 in ?? ()
  120  LWP 11518 "excess-log-dele" 0x00007f7923b52fb9 in ?? ()
  121  LWP 11519 "acceptor-11519" 0x00007f791ef590c7 in ?? ()
  122  LWP 11520 "heartbeat-11520" 0x00007f7923b52fb9 in ?? ()
  123  LWP 11521 "maintenance_sch" 0x00007f7923b52fb9 in ?? ()

Thread 123 (LWP 11521):
#0  0x00007f7923b52fb9 in ?? ()
#1  0x00007b0100000000 in ?? ()
#2  0x0000000000000104 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b54000028f0 in ?? ()
#5  0x00007f78d7db96c0 in ?? ()
#6  0x0000000000000208 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 11520):
#0  0x00007f7923b52fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 121 (LWP 11519):
#0  0x00007f791ef590c7 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 120 (LWP 11518):
#0  0x00007f7923b52fb9 in ?? ()
#1  0x00007f78d95bc940 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007fff184b78a0 in ?? ()
#5  0x00007f78d95bc7b0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 11517):
#0  0x00007f7923b52fb9 in ?? ()
#1  0x0000000085352fb8 in ?? ()
#2  0x0000000000000041 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b3400001008 in ?? ()
#5  0x00007f78d9dbd800 in ?? ()
#6  0x0000000000000082 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 118 (LWP 11516):
#0  0x00007f7923b52fb9 in ?? ()
#1  0x00007f791ce88008 in ?? ()
#2  0x0000000000000041 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4000000c90 in ?? ()
#5  0x00007f78da5be750 in ?? ()
#6  0x0000000000000082 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 11515):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 116 (LWP 11514):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 115 (LWP 11513):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 114 (LWP 11512):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 11511):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 11510):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 11509):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 11508):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 11507):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 11506):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 11505):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 11504):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 11503):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 11502):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 11501):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 11500):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 11499):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 11498):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 11497):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 11496):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 97 (LWP 11495):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 11494):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 11493):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 11492):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 11491):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 11490):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 11489):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 11488):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 11487):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 11486):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 11485):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 11484):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 11483):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 11482):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 11481):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 11480):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 11479):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 11478):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 11477):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 11476):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 11475):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000772 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00007b24001147c8 in ?? ()
#4  0x00007f78ef7ba710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f78ef7ba730 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 76 (LWP 11474):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x00000000000007b1 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240010ffcc in ?? ()
#4  0x00007f78effbb710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f78effbb730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000000000045e4c9 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007f7923b52770 in ?? ()
#10 0x00007f78effbb730 in ?? ()
#11 0x00007f78d39d4680 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 75 (LWP 11473):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 74 (LWP 11472):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 11471):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 11470):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 11469):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 11468):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 11467):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 11466):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 11465):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 11464):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 11463):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 11462):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 11461):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 11460):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 11459):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 11458):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 11457):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 11456):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 11455):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b24000b902c in ?? ()
#4  0x00007f78f9bbc710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f78f9bbc730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x007f0400000026c8 in ?? ()
#9  0x00007f7923b52770 in ?? ()
#10 0x00007f78f9bbc730 in ?? ()
#11 0x0002008300000dfe in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 56 (LWP 11454):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 55 (LWP 11453):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 11452):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 11451):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 11450):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 11449):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 11448):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 11447):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 11446):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 11445):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 11444):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 11443):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 11442):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 11441):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 11440):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 11439):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 11438):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 11437):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 11436):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 11435):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240005ffec in ?? ()
#4  0x00007f7903fbe710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f7903fbe730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000000000045e4c9 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007f7923b52770 in ?? ()
#10 0x00007f7903fbe730 in ?? ()
#11 0x00007f791bf9fc58 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 36 (LWP 11434):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 35 (LWP 11433):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 34 (LWP 11432):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 33 (LWP 11431):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 32 (LWP 11430):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 31 (LWP 11429):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 30 (LWP 11428):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 11427):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 11426):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 11425):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 11424):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 11423):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 11422):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 11421):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 11420):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 11419):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 11418):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 19 (LWP 11417):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 18 (LWP 11416):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 17 (LWP 11415):
#0  0x00007f7923b52fb9 in ?? ()
#1  0x0000000017a335f0 in ?? ()
#2  0x0000000000000006 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4800003a00 in ?? ()
#5  0x00007f790e592700 in ?? ()
#6  0x000000000000000c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 16 (LWP 11414):
#0  0x00007f7923b52fb9 in ?? ()
#1  0x00007f790ed939a8 in ?? ()
#2  0x000000000000000d in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4400037198 in ?? ()
#5  0x00007f790ed93840 in ?? ()
#6  0x000000000000001a in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 15 (LWP 11413):
#0  0x00007f7923b52fb9 in ?? ()
#1  0x0000000000000018 in ?? ()
#2  0x0000000000000006 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b5800000118 in ?? ()
#5  0x00007f790f594410 in ?? ()
#6  0x000000000000000c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 11412):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 13 (LWP 11411):
#0  0x00007f791ef57a47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 11410):
#0  0x00007f791ef57a47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 11 (LWP 11409):
#0  0x00007f791ef57a47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 10 (LWP 11408):
#0  0x00007f791ef57a47 in ?? ()
#1  0x00007b5800010108 in ?? ()
#2  0x003ce00001950e9c in ?? ()
#3  0x00007f79145be500 in ?? ()
#4  0x00007f79145bfb80 in ?? ()
#5  0x00007f79145be500 in ?? ()
#6  0x000000000000000d in ?? ()
#7  0x00007b5800000f00 in ?? ()
#8  0x0000000000488695 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007f791c800000 in ?? ()
#10 0x0000000000488599 in __sanitizer::internal_alloc_placeholder ()
#11 0x00007f79145bfb80 in ?? ()
#12 0x00007f79219b0069 in ?? ()
#13 0x00007b4c00000000 in ?? ()
#14 0x00007f79271311a0 in ?? ()
#15 0x00007b4c00002c10 in ?? ()
#16 0x00007b4c00002c18 in ?? ()
#17 0x00007f79145be7a0 in ?? ()
#18 0x00007b4400036a00 in ?? ()
#19 0x00007f79145becd0 in ?? ()
#20 0x0000000000000000 in ?? ()

Thread 9 (LWP 11405):
#0  0x00007f791ef4acb9 in ?? ()
#1  0x00007f7917dbcc10 in ?? ()
#2  0x00007b0400009010 in ?? ()
#3  0x00007f7917dbdb80 in ?? ()
#4  0x00007f7917dbcc10 in ?? ()
#5  0x00007b0400009010 in ?? ()
#6  0x00000000004888a3 in __sanitizer::internal_alloc_placeholder ()
#7  0x00007f791c88e000 in ?? ()
#8  0x0100000000000001 in ?? ()
#9  0x00007f7917dbdb80 in ?? ()
#10 0x00007f792892fb28 in ?? ()
#11 0x0000000000000000 in ?? ()

Thread 8 (LWP 11404):
#0  0x00007f7923b52fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 7 (LWP 11403):
#0  0x00007f7923b569e2 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 6 (LWP 11395):
#0  0x00007f7923b52fb9 in ?? ()
#1  0x00007f7918dbea40 in ?? ()
#2  0x000000000000014f in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b44000361d8 in ?? ()
#5  0x00007f7918dbe5d0 in ?? ()
#6  0x000000000000029e in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 5 (LWP 11394):
#0  0x00007f7923b52fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 4 (LWP 11393):
#0  0x00007f7923b52fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 3 (LWP 11392):
#0  0x00007f7923b52fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 2 (LWP 11391):
#0  0x00007f791ef1a7a0 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 1 (LWP 11390):
#0  0x00007f7923b56d50 in ?? ()
#1  0x0000600001000078 in ?? ()
#2  0x0000000000467b2b in __sanitizer::internal_alloc_placeholder ()
#3  0x00007f791e178cc0 in ?? ()
#4  0x00007f791e178cc0 in ?? ()
#5  0x00007fff184b76b0 in ?? ()
#6  0x000000000048aef4 in __sanitizer::internal_alloc_placeholder ()
#7  0x0000600001000078 in ?? ()
#8  0x0000e00000a94dc5 in ?? ()
#9  0x00007f791e178cc0 in ?? ()
#10 0x00007f792207ef0b in ?? ()
#11 0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20250626 01:59:02.378561 10490 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 2 with UUID c1be94bb90e44876a3647fd124cf6adf and pid 11524
************************ BEGIN STACKS **************************
[New LWP 11525]
[New LWP 11526]
[New LWP 11527]
[New LWP 11528]
[New LWP 11529]
[New LWP 11536]
[New LWP 11537]
[New LWP 11538]
[New LWP 11541]
[New LWP 11542]
[New LWP 11543]
[New LWP 11544]
[New LWP 11545]
[New LWP 11546]
[New LWP 11547]
[New LWP 11548]
[New LWP 11549]
[New LWP 11550]
[New LWP 11551]
[New LWP 11552]
[New LWP 11553]
[New LWP 11554]
[New LWP 11555]
[New LWP 11556]
[New LWP 11557]
[New LWP 11558]
[New LWP 11559]
[New LWP 11560]
[New LWP 11561]
[New LWP 11562]
[New LWP 11563]
[New LWP 11564]
[New LWP 11565]
[New LWP 11566]
[New LWP 11567]
[New LWP 11568]
[New LWP 11569]
[New LWP 11570]
[New LWP 11571]
[New LWP 11572]
[New LWP 11573]
[New LWP 11574]
[New LWP 11575]
[New LWP 11576]
[New LWP 11577]
[New LWP 11578]
[New LWP 11579]
[New LWP 11580]
[New LWP 11581]
[New LWP 11582]
[New LWP 11583]
[New LWP 11584]
[New LWP 11585]
[New LWP 11586]
[New LWP 11587]
[New LWP 11588]
[New LWP 11589]
[New LWP 11590]
[New LWP 11591]
[New LWP 11592]
[New LWP 11593]
[New LWP 11594]
[New LWP 11595]
[New LWP 11596]
[New LWP 11597]
[New LWP 11598]
[New LWP 11599]
[New LWP 11600]
[New LWP 11601]
[New LWP 11602]
[New LWP 11603]
[New LWP 11604]
[New LWP 11605]
[New LWP 11606]
[New LWP 11607]
[New LWP 11608]
[New LWP 11609]
[New LWP 11610]
[New LWP 11611]
[New LWP 11612]
[New LWP 11613]
[New LWP 11614]
[New LWP 11615]
[New LWP 11616]
[New LWP 11617]
[New LWP 11618]
[New LWP 11619]
[New LWP 11620]
[New LWP 11621]
[New LWP 11622]
[New LWP 11623]
[New LWP 11624]
[New LWP 11625]
[New LWP 11626]
[New LWP 11627]
[New LWP 11628]
[New LWP 11629]
[New LWP 11630]
[New LWP 11631]
[New LWP 11632]
[New LWP 11633]
[New LWP 11634]
[New LWP 11635]
[New LWP 11636]
[New LWP 11637]
[New LWP 11638]
[New LWP 11639]
[New LWP 11640]
[New LWP 11641]
[New LWP 11642]
[New LWP 11643]
[New LWP 11644]
[New LWP 11645]
[New LWP 11646]
[New LWP 11647]
[New LWP 11648]
[New LWP 11649]
[New LWP 11650]
[New LWP 11651]
[New LWP 11652]
[New LWP 11653]
[New LWP 11654]
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c4802038e
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c4802038e
0x00007f4b9bb11d50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 11524 "kudu"  0x00007f4b9bb11d50 in ?? ()
  2    LWP 11525 "kudu"  0x00007f4b96ed57a0 in ?? ()
  3    LWP 11526 "kudu"  0x00007f4b9bb0dfb9 in ?? ()
  4    LWP 11527 "kudu"  0x00007f4b9bb0dfb9 in ?? ()
  5    LWP 11528 "kudu"  0x00007f4b9bb0dfb9 in ?? ()
  6    LWP 11529 "kernel-watcher-" 0x00007f4b9bb0dfb9 in ?? ()
  7    LWP 11536 "ntp client-1153" 0x00007f4b9bb119e2 in ?? ()
  8    LWP 11537 "file cache-evic" 0x00007f4b9bb0dfb9 in ?? ()
  9    LWP 11538 "sq_acceptor" 0x00007f4b96f05cb9 in ?? ()
  10   LWP 11541 "rpc reactor-115" 0x00007f4b96f12a47 in ?? ()
  11   LWP 11542 "rpc reactor-115" 0x00007f4b96f12a47 in ?? ()
  12   LWP 11543 "rpc reactor-115" 0x00007f4b96f12a47 in ?? ()
  13   LWP 11544 "rpc reactor-115" 0x00007f4b96f12a47 in ?? ()
  14   LWP 11545 "MaintenanceMgr " 0x00007f4b9bb0dad3 in ?? ()
  15   LWP 11546 "txn-status-mana" 0x00007f4b9bb0dfb9 in ?? ()
  16   LWP 11547 "collect_and_rem" 0x00007f4b9bb0dfb9 in ?? ()
  17   LWP 11548 "tc-session-exp-" 0x00007f4b9bb0dfb9 in ?? ()
  18   LWP 11549 "rpc worker-1154" 0x00007f4b9bb0dad3 in ?? ()
  19   LWP 11550 "rpc worker-1155" 0x00007f4b9bb0dad3 in ?? ()
  20   LWP 11551 "rpc worker-1155" 0x00007f4b9bb0dad3 in ?? ()
  21   LWP 11552 "rpc worker-1155" 0x00007f4b9bb0dad3 in ?? ()
  22   LWP 11553 "rpc worker-1155" 0x00007f4b9bb0dad3 in ?? ()
  23   LWP 11554 "rpc worker-1155" 0x00007f4b9bb0dad3 in ?? ()
  24   LWP 11555 "rpc worker-1155" 0x00007f4b9bb0dad3 in ?? ()
  25   LWP 11556 "rpc worker-1155" 0x00007f4b9bb0dad3 in ?? ()
  26   LWP 11557 "rpc worker-1155" 0x00007f4b9bb0dad3 in ?? ()
  27   LWP 11558 "rpc worker-1155" 0x00007f4b9bb0dad3 in ?? ()
  28   LWP 11559 "rpc worker-1155" 0x00007f4b9bb0dad3 in ?? ()
  29   LWP 11560 "rpc worker-1156" 0x00007f4b9bb0dad3 in ?? ()
  30   LWP 11561 "rpc worker-1156" 0x00007f4b9bb0dad3 in ?? ()
  31   LWP 11562 "rpc worker-1156" 0x00007f4b9bb0dad3 in ?? ()
  32   LWP 11563 "rpc worker-1156" 0x00007f4b9bb0dad3 in ?? ()
  33   LWP 11564 "rpc worker-1156" 0x00007f4b9bb0dad3 in ?? ()
  34   LWP 11565 "rpc worker-1156" 0x00007f4b9bb0dad3 in ?? ()
  35   LWP 11566 "rpc worker-1156" 0x00007f4b9bb0dad3 in ?? ()
  36   LWP 11567 "rpc worker-1156" 0x00007f4b9bb0dad3 in ?? ()
  37   LWP 11568 "rpc worker-1156" 0x00007f4b9bb0dad3 in ?? ()
  38   LWP 11569 "rpc worker-1156" 0x00007f4b9bb0dad3 in ?? ()
  39   LWP 11570 "rpc worker-1157" 0x00007f4b9bb0dad3 in ?? ()
  40   LWP 11571 "rpc worker-1157" 0x00007f4b9bb0dad3 in ?? ()
  41   LWP 11572 "rpc worker-1157" 0x00007f4b9bb0dad3 in ?? ()
  42   LWP 11573 "rpc worker-1157" 0x00007f4b9bb0dad3 in ?? ()
  43   LWP 11574 "rpc worker-1157" 0x00007f4b9bb0dad3 in ?? ()
  44   LWP 11575 "rpc worker-1157" 0x00007f4b9bb0dad3 in ?? ()
  45   LWP 11576 "rpc worker-1157" 0x00007f4b9bb0dad3 in ?? ()
  46   LWP 11577 "rpc worker-1157" 0x00007f4b9bb0dad3 in ?? ()
  47   LWP 11578 "rpc worker-1157" 0x00007f4b9bb0dad3 in ?? ()
  48   LWP 11579 "rpc worker-1157" 0x00007f4b9bb0dad3 in ?? ()
  49   LWP 11580 "rpc worker-1158" 0x00007f4b9bb0dad3 in ?? ()
  50   LWP 11581 "rpc worker-1158" 0x00007f4b9bb0dad3 in ?? ()
  51   LWP 11582 "rpc worker-1158" 0x00007f4b9bb0dad3 in ?? ()
  52   LWP 11583 "rpc worker-1158" 0x00007f4b9bb0dad3 in ?? ()
  53   LWP 11584 "rpc worker-1158" 0x00007f4b9bb0dad3 in ?? ()
  54   LWP 11585 "rpc worker-1158" 0x00007f4b9bb0dad3 in ?? ()
  55   LWP 11586 "rpc worker-1158" 0x00007f4b9bb0dad3 in ?? ()
  56   LWP 11587 "rpc worker-1158" 0x00007f4b9bb0dad3 in ?? ()
  57   LWP 11588 "rpc worker-1158" 0x00007f4b9bb0dad3 in ?? ()
  58   LWP 11589 "rpc worker-1158" 0x00007f4b9bb0dad3 in ?? ()
  59   LWP 11590 "rpc worker-1159" 0x00007f4b9bb0dad3 in ?? ()
  60   LWP 11591 "rpc worker-1159" 0x00007f4b9bb0dad3 in ?? ()
  61   LWP 11592 "rpc worker-1159" 0x00007f4b9bb0dad3 in ?? ()
  62   LWP 11593 "rpc worker-1159" 0x00007f4b9bb0dad3 in ?? ()
  63   LWP 11594 "rpc worker-1159" 0x00007f4b9bb0dad3 in ?? ()
  64   LWP 11595 "rpc worker-1159" 0x00007f4b9bb0dad3 in ?? ()
  65   LWP 11596 "rpc worker-1159" 0x00007f4b9bb0dad3 in ?? ()
  66   LWP 11597 "rpc worker-1159" 0x00007f4b9bb0dad3 in ?? ()
  67   LWP 11598 "rpc worker-1159" 0x00007f4b9bb0dad3 in ?? ()
  68   LWP 11599 "rpc worker-1159" 0x00007f4b9bb0dad3 in ?? ()
  69   LWP 11600 "rpc worker-1160" 0x00007f4b9bb0dad3 in ?? ()
  70   LWP 11601 "rpc worker-1160" 0x00007f4b9bb0dad3 in ?? ()
  71   LWP 11602 "rpc worker-1160" 0x00007f4b9bb0dad3 in ?? ()
  72   LWP 11603 "rpc worker-1160" 0x00007f4b9bb0dad3 in ?? ()
  73   LWP 11604 "rpc worker-1160" 0x00007f4b9bb0dad3 in ?? ()
  74   LWP 11605 "rpc worker-1160" 0x00007f4b9bb0dad3 in ?? ()
  75   LWP 11606 "rpc worker-1160" 0x00007f4b9bb0dad3 in ?? ()
  76   LWP 11607 "rpc worker-1160" 0x00007f4b9bb0dad3 in ?? ()
  77   LWP 11608 "rpc worker-1160" 0x00007f4b9bb0dad3 in ?? ()
  78   LWP 11609 "rpc worker-1160" 0x00007f4b9bb0dad3 in ?? ()
  79   LWP 11610 "rpc worker-1161" 0x00007f4b9bb0dad3 in ?? ()
  80   LWP 11611 "rpc worker-1161" 0x00007f4b9bb0dad3 in ?? ()
  81   LWP 11612 "rpc worker-1161" 0x00007f4b9bb0dad3 in ?? ()
  82   LWP 11613 "rpc worker-1161" 0x00007f4b9bb0dad3 in ?? ()
  83   LWP 11614 "rpc worker-1161" 0x00007f4b9bb0dad3 in ?? ()
  84   LWP 11615 "rpc worker-1161" 0x00007f4b9bb0dad3 in ?? ()
  85   LWP 11616 "rpc worker-1161" 0x00007f4b9bb0dad3 in ?? ()
  86   LWP 11617 "rpc worker-1161" 0x00007f4b9bb0dad3 in ?? ()
  87   LWP 11618 "rpc worker-1161" 0x00007f4b9bb0dad3 in ?? ()
  88   LWP 11619 "rpc worker-1161" 0x00007f4b9bb0dad3 in ?? ()
  89   LWP 11620 "rpc worker-1162" 0x00007f4b9bb0dad3 in ?? ()
  90   LWP 11621 "rpc worker-1162" 0x00007f4b9bb0dad3 in ?? ()
  91   LWP 11622 "rpc worker-1162" 0x00007f4b9bb0dad3 in ?? ()
  92   LWP 11623 "rpc worker-1162" 0x00007f4b9bb0dad3 in ?? ()
  93   LWP 11624 "rpc worker-1162" 0x00007f4b9bb0dad3 in ?? ()
  94   LWP 11625 "rpc worker-1162" 0x00007f4b9bb0dad3 in ?? ()
  95   LWP 11626 "rpc worker-1162" 0x00007f4b9bb0dad3 in ?? ()
  96   LWP 11627 "rpc worker-1162" 0x00007f4b9bb0dad3 in ?? ()
  97   LWP 11628 "rpc worker-1162" 0x00007f4b9bb0dad3 in ?? ()
  98   LWP 11629 "rpc worker-1162" 0x00007f4b9bb0dad3 in ?? ()
  99   LWP 11630 "rpc worker-1163" 0x00007f4b9bb0dad3 in ?? ()
  100  LWP 11631 "rpc worker-1163" 0x00007f4b9bb0dad3 in ?? ()
  101  LWP 11632 "rpc worker-1163" 0x00007f4b9bb0dad3 in ?? ()
  102  LWP 11633 "rpc worker-1163" 0x00007f4b9bb0dad3 in ?? ()
  103  LWP 11634 "rpc worker-1163" 0x00007f4b9bb0dad3 in ?? ()
  104  LWP 11635 "rpc worker-1163" 0x00007f4b9bb0dad3 in ?? ()
  105  LWP 11636 "rpc worker-1163" 0x00007f4b9bb0dad3 in ?? ()
  106  LWP 11637 "rpc worker-1163" 0x00007f4b9bb0dad3 in ?? ()
  107  LWP 11638 "rpc worker-1163" 0x00007f4b9bb0dad3 in ?? ()
  108  LWP 11639 "rpc worker-1163" 0x00007f4b9bb0dad3 in ?? ()
  109  LWP 11640 "rpc worker-1164" 0x00007f4b9bb0dad3 in ?? ()
  110  LWP 11641 "rpc worker-1164" 0x00007f4b9bb0dad3 in ?? ()
  111  LWP 11642 "rpc worker-1164" 0x00007f4b9bb0dad3 in ?? ()
  112  LWP 11643 "rpc worker-1164" 0x00007f4b9bb0dad3 in ?? ()
  113  LWP 11644 "rpc worker-1164" 0x00007f4b9bb0dad3 in ?? ()
  114  LWP 11645 "rpc worker-1164" 0x00007f4b9bb0dad3 in ?? ()
  115  LWP 11646 "rpc worker-1164" 0x00007f4b9bb0dad3 in ?? ()
  116  LWP 11647 "rpc worker-1164" 0x00007f4b9bb0dad3 in ?? ()
  117  LWP 11648 "rpc worker-1164" 0x00007f4b9bb0dad3 in ?? ()
  118  LWP 11649 "diag-logger-116" 0x00007f4b9bb0dfb9 in ?? ()
  119  LWP 11650 "result-tracker-" 0x00007f4b9bb0dfb9 in ?? ()
  120  LWP 11651 "excess-log-dele" 0x00007f4b9bb0dfb9 in ?? ()
  121  LWP 11652 "acceptor-11652" 0x00007f4b96f140c7 in ?? ()
  122  LWP 11653 "heartbeat-11653" 0x00007f4b9bb0dfb9 in ?? ()
  123  LWP 11654 "maintenance_sch" 0x00007f4b9bb0dfb9 in ?? ()

Thread 123 (LWP 11654):
#0  0x00007f4b9bb0dfb9 in ?? ()
#1  0x00007b0100000000 in ?? ()
#2  0x0000000000000101 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b54000028f0 in ?? ()
#5  0x00007f4b4fdb96c0 in ?? ()
#6  0x0000000000000202 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 11653):
#0  0x00007f4b9bb0dfb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 121 (LWP 11652):
#0  0x00007f4b96f140c7 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 120 (LWP 11651):
#0  0x00007f4b9bb0dfb9 in ?? ()
#1  0x00007f4b515bc940 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffe91554e90 in ?? ()
#5  0x00007f4b515bc7b0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 11650):
#0  0x00007f4b9bb0dfb9 in ?? ()
#1  0x0000000085352fb8 in ?? ()
#2  0x0000000000000040 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b3400001008 in ?? ()
#5  0x00007f4b51dbd800 in ?? ()
#6  0x0000000000000080 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 118 (LWP 11649):
#0  0x00007f4b9bb0dfb9 in ?? ()
#1  0x00007f4b94e88008 in ?? ()
#2  0x000000000000003c in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4000000c90 in ?? ()
#5  0x00007f4b525be750 in ?? ()
#6  0x0000000000000078 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 11648):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 116 (LWP 11647):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 115 (LWP 11646):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 114 (LWP 11645):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 11644):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 11643):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 11642):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 11641):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 11640):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 11639):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 11638):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 11637):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 11636):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 11635):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 11634):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 11633):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 11632):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 11631):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 11630):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 11629):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 97 (LWP 11628):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 11627):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 11626):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 11625):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 11624):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 11623):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 11622):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 11621):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 11620):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 11619):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 11618):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 11617):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 11616):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 11615):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 11614):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 11613):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 11612):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 11611):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 11610):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 11609):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 11608):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x000000000000085c in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00007b24001147c8 in ?? ()
#4  0x00007f4b677ba710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f4b677ba730 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 76 (LWP 11607):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x00000000000005d1 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240010ffcc in ?? ()
#4  0x00007f4b67fbb710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f4b67fbb730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x0000000000000000 in ?? ()

Thread 75 (LWP 11606):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x00000000000000fb in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240010d7dc in ?? ()
#4  0x00007f4b687bc710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f4b687bc730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000000000045e4c9 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007f4b9bb0d770 in ?? ()
#10 0x00007f4b687bc730 in ?? ()
#11 0x00007f4b4cb05e78 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 74 (LWP 11605):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 11604):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 11603):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 11602):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 11601):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 11600):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 11599):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 11598):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 11597):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 11596):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 11595):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 11594):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 11593):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 11592):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 11591):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 11590):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 11589):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 11588):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b24000b902c in ?? ()
#4  0x00007f4b71bbc710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f4b71bbc730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x0000000000000000 in ?? ()

Thread 56 (LWP 11587):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 55 (LWP 11586):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 11585):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 11584):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 11583):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 11582):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 11581):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 11580):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 11579):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 11578):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 11577):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 11576):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 11575):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 11574):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 11573):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 11572):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 11571):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 11570):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 11569):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 11568):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240005ffec in ?? ()
#4  0x00007f4b7bfbe710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f4b7bfbe730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000000000045e4c9 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007f4b9bb0d770 in ?? ()
#10 0x00007f4b7bfbe730 in ?? ()
#11 0x00007f4b93f69c50 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 36 (LWP 11567):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 35 (LWP 11566):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 34 (LWP 11565):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 33 (LWP 11564):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 32 (LWP 11563):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 31 (LWP 11562):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 30 (LWP 11561):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 11560):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 11559):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 11558):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 11557):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 11556):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 11555):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 11554):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 11553):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 11552):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 11551):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 19 (LWP 11550):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 18 (LWP 11549):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 17 (LWP 11548):
#0  0x00007f4b9bb0dfb9 in ?? ()
#1  0x0000000017a335f0 in ?? ()
#2  0x0000000000000006 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4800003a00 in ?? ()
#5  0x00007f4b86592700 in ?? ()
#6  0x000000000000000c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 16 (LWP 11547):
#0  0x00007f4b9bb0dfb9 in ?? ()
#1  0x00007f4b86d939a8 in ?? ()
#2  0x000000000000000c in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4400037198 in ?? ()
#5  0x00007f4b86d93840 in ?? ()
#6  0x0000000000000018 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 15 (LWP 11546):
#0  0x00007f4b9bb0dfb9 in ?? ()
#1  0x0000000000000018 in ?? ()
#2  0x0000000000000006 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b5800000118 in ?? ()
#5  0x00007f4b87594410 in ?? ()
#6  0x000000000000000c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 11545):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 13 (LWP 11544):
#0  0x00007f4b96f12a47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 11543):
#0  0x00007f4b96f12a47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 11 (LWP 11542):
#0  0x00007f4b96f12a47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 10 (LWP 11541):
#0  0x00007f4b96f12a47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 9 (LWP 11538):
#0  0x00007f4b96f05cb9 in ?? ()
#1  0x00007f4b8fdbcc10 in ?? ()
#2  0x00007b040000a860 in ?? ()
#3  0x00007f4b8fdbdb80 in ?? ()
#4  0x00007f4b8fdbcc10 in ?? ()
#5  0x00007b040000a860 in ?? ()
#6  0x00000000004888a3 in __sanitizer::internal_alloc_placeholder ()
#7  0x00007f4b9483a000 in ?? ()
#8  0x0100000000000001 in ?? ()
#9  0x00007f4b8fdbdb80 in ?? ()
#10 0x00007f4ba08eab28 in ?? ()
#11 0x0000000000000000 in ?? ()

Thread 8 (LWP 11537):
#0  0x00007f4b9bb0dfb9 in ?? ()
#1  0x0000600000000000 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4400034018 in ?? ()
#5  0x00007f4b8f5bb7f0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 7 (LWP 11536):
#0  0x00007f4b9bb119e2 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 6 (LWP 11529):
#0  0x00007f4b9bb0dfb9 in ?? ()
#1  0x00007f4b90dbea40 in ?? ()
#2  0x0000000000000147 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b44000361d8 in ?? ()
#5  0x00007f4b90dbe5d0 in ?? ()
#6  0x000000000000028e in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 5 (LWP 11528):
#0  0x00007f4b9bb0dfb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 4 (LWP 11527):
#0  0x00007f4b9bb0dfb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 3 (LWP 11526):
#0  0x00007f4b9bb0dfb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 2 (LWP 11525):
#0  0x00007f4b96ed57a0 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 1 (LWP 11524):
#0  0x00007f4b9bb11d50 in ?? ()
#1  0x0000600001000078 in ?? ()
#2  0x0000000000467b2b in __sanitizer::internal_alloc_placeholder ()
#3  0x00007f4b96133cc0 in ?? ()
#4  0x00007f4b96133cc0 in ?? ()
#5  0x00007ffe91554ca0 in ?? ()
#6  0x000000000048aef4 in __sanitizer::internal_alloc_placeholder ()
#7  0x0000600001000078 in ?? ()
#8  0x0000e00000a9adcd in ?? ()
#9  0x00007f4b96133cc0 in ?? ()
#10 0x00007f4b9a039f0b in ?? ()
#11 0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20250626 01:59:03.511559 10490 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu with pid 11257
I20250626 01:59:03.570781 10490 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu with pid 11390
I20250626 01:59:03.631290 10490 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu with pid 11524
I20250626 01:59:03.693063 10490 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu with pid 11165
2025-06-26T01:59:03Z chronyd exiting
I20250626 01:59:03.763218 10490 test_util.cc:183] -----------------------------------------------
I20250626 01:59:03.763502 10490 test_util.cc:184] Had failures, leaving test files at /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0

Full log

Note: This is test shard 1 of 6.
[==========] Running 5 tests from 2 test suites.
[----------] Global test environment set-up.
[----------] 4 tests from TabletCopyITest
[ RUN      ] TabletCopyITest.TestRejectRogueLeader
/home/jenkins-slave/workspace/build_and_test_flaky@2/src/kudu/integration-tests/tablet_copy-itest.cc:172: Skipped
test is skipped; set KUDU_ALLOW_SLOW_TESTS=1 to run
[  SKIPPED ] TabletCopyITest.TestRejectRogueLeader (12 ms)
[ RUN      ] TabletCopyITest.TestDeleteLeaderDuringTabletCopyStressTest
/home/jenkins-slave/workspace/build_and_test_flaky@2/src/kudu/integration-tests/tablet_copy-itest.cc:727: Skipped
test is skipped; set KUDU_ALLOW_SLOW_TESTS=1 to run
[  SKIPPED ] TabletCopyITest.TestDeleteLeaderDuringTabletCopyStressTest (7 ms)
[ RUN      ] TabletCopyITest.TestTabletCopyThrottling
2025-06-26T01:57:25Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2025-06-26T01:57:25Z Disabled control of system clock
WARNING: Logging before InitGoogleLogging() is written to STDERR
I20250626 01:57:25.038558 10490 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu
/tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.10.62.190:45805
--webserver_interface=127.10.62.190
--webserver_port=0
--builtin_ntp_servers=127.10.62.148:39413
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.10.62.190:45805
--master_tombstone_evicted_tablet_replicas=false with env {}
W20250626 01:57:25.392907 10499 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250626 01:57:25.393599 10499 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250626 01:57:25.394052 10499 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250626 01:57:25.429703 10499 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20250626 01:57:25.430063 10499 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250626 01:57:25.430289 10499 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20250626 01:57:25.430495 10499 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20250626 01:57:25.469491 10499 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.10.62.148:39413
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/wal
--master_tombstone_evicted_tablet_replicas=false
--ipki_ca_key_size=768
--master_addresses=127.10.62.190:45805
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.10.62.190:45805
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.10.62.190
--webserver_port=0
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.18.0-SNAPSHOT
revision f7c956859e2f49c4cf1caffa969c1777a7a5d81c
build type FASTDEBUG
built by None at 26 Jun 2025 01:43:20 UTC on 5fd53c4cbb9d
build id 6773
TSAN enabled
I20250626 01:57:25.471295 10499 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250626 01:57:25.473351 10499 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250626 01:57:25.491526 10506 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250626 01:57:25.492621 10505 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250626 01:57:25.495136 10499 server_base.cc:1048] running on GCE node
W20250626 01:57:25.492995 10508 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250626 01:57:26.700888 10499 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250626 01:57:26.704895 10499 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250626 01:57:26.706452 10499 hybrid_clock.cc:648] HybridClock initialized: now 1750903046706423 us; error 66 us; skew 500 ppm
I20250626 01:57:26.707484 10499 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250626 01:57:26.719552 10499 webserver.cc:469] Webserver started at http://127.10.62.190:43985/ using document root <none> and password file <none>
I20250626 01:57:26.720764 10499 fs_manager.cc:362] Metadata directory not provided
I20250626 01:57:26.720999 10499 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250626 01:57:26.721591 10499 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250626 01:57:26.726867 10499 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/data/instance:
uuid: "6299a79b2e1e4378a76d9a5ce8ca3fe0"
format_stamp: "Formatted at 2025-06-26 01:57:26 on dist-test-slave-k90p"
I20250626 01:57:26.728330 10499 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/wal/instance:
uuid: "6299a79b2e1e4378a76d9a5ce8ca3fe0"
format_stamp: "Formatted at 2025-06-26 01:57:26 on dist-test-slave-k90p"
I20250626 01:57:26.737890 10499 fs_manager.cc:696] Time spent creating directory manager: real 0.009s	user 0.001s	sys 0.008s
I20250626 01:57:26.744833 10515 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20250626 01:57:26.746223 10499 fs_manager.cc:730] Time spent opening block manager: real 0.004s	user 0.003s	sys 0.002s
I20250626 01:57:26.746652 10499 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/data,/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/wal
uuid: "6299a79b2e1e4378a76d9a5ce8ca3fe0"
format_stamp: "Formatted at 2025-06-26 01:57:26 on dist-test-slave-k90p"
I20250626 01:57:26.747053 10499 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250626 01:57:26.808351 10499 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20250626 01:57:26.810264 10499 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250626 01:57:26.810796 10499 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250626 01:57:26.898931 10499 rpc_server.cc:307] RPC server started. Bound to: 127.10.62.190:45805
I20250626 01:57:26.899008 10566 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.10.62.190:45805 every 8 connection(s)
I20250626 01:57:26.902262 10499 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/data/info.pb
I20250626 01:57:26.906514 10490 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu as pid 10499
I20250626 01:57:26.907059 10490 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/wal/instance
I20250626 01:57:26.909255 10567 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20250626 01:57:26.932497 10567 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0: Bootstrap starting.
I20250626 01:57:26.939425 10567 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0: Neither blocks nor log segments found. Creating new log.
I20250626 01:57:26.941608 10567 log.cc:826] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0: Log is configured to *not* fsync() on all Append() calls
I20250626 01:57:26.948227 10567 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0: No bootstrap required, opened a new log
I20250626 01:57:26.970137 10567 raft_consensus.cc:357] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6299a79b2e1e4378a76d9a5ce8ca3fe0" member_type: VOTER last_known_addr { host: "127.10.62.190" port: 45805 } }
I20250626 01:57:26.971249 10567 raft_consensus.cc:383] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250626 01:57:26.971513 10567 raft_consensus.cc:738] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 6299a79b2e1e4378a76d9a5ce8ca3fe0, State: Initialized, Role: FOLLOWER
I20250626 01:57:26.972293 10567 consensus_queue.cc:260] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6299a79b2e1e4378a76d9a5ce8ca3fe0" member_type: VOTER last_known_addr { host: "127.10.62.190" port: 45805 } }
I20250626 01:57:26.972887 10567 raft_consensus.cc:397] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250626 01:57:26.973165 10567 raft_consensus.cc:491] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250626 01:57:26.973500 10567 raft_consensus.cc:3058] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [term 0 FOLLOWER]: Advancing to term 1
I20250626 01:57:26.978631 10567 raft_consensus.cc:513] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6299a79b2e1e4378a76d9a5ce8ca3fe0" member_type: VOTER last_known_addr { host: "127.10.62.190" port: 45805 } }
I20250626 01:57:26.979606 10567 leader_election.cc:304] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 6299a79b2e1e4378a76d9a5ce8ca3fe0; no voters: 
I20250626 01:57:26.981629 10567 leader_election.cc:290] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [CANDIDATE]: Term 1 election: Requested vote from peers 
I20250626 01:57:26.982606 10572 raft_consensus.cc:2802] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [term 1 FOLLOWER]: Leader election won for term 1
I20250626 01:57:26.985546 10572 raft_consensus.cc:695] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [term 1 LEADER]: Becoming Leader. State: Replica: 6299a79b2e1e4378a76d9a5ce8ca3fe0, State: Running, Role: LEADER
I20250626 01:57:26.986642 10567 sys_catalog.cc:564] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [sys.catalog]: configured and running, proceeding with master startup.
I20250626 01:57:26.986490 10572 consensus_queue.cc:237] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6299a79b2e1e4378a76d9a5ce8ca3fe0" member_type: VOTER last_known_addr { host: "127.10.62.190" port: 45805 } }
I20250626 01:57:26.999504 10573 sys_catalog.cc:455] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "6299a79b2e1e4378a76d9a5ce8ca3fe0" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6299a79b2e1e4378a76d9a5ce8ca3fe0" member_type: VOTER last_known_addr { host: "127.10.62.190" port: 45805 } } }
I20250626 01:57:27.000900 10573 sys_catalog.cc:458] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [sys.catalog]: This master's current role is: LEADER
I20250626 01:57:27.001163 10574 sys_catalog.cc:455] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 6299a79b2e1e4378a76d9a5ce8ca3fe0. Latest consensus state: current_term: 1 leader_uuid: "6299a79b2e1e4378a76d9a5ce8ca3fe0" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6299a79b2e1e4378a76d9a5ce8ca3fe0" member_type: VOTER last_known_addr { host: "127.10.62.190" port: 45805 } } }
I20250626 01:57:27.002005 10574 sys_catalog.cc:458] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [sys.catalog]: This master's current role is: LEADER
I20250626 01:57:27.004848 10581 catalog_manager.cc:1477] Loading table and tablet metadata into memory...
I20250626 01:57:27.019542 10581 catalog_manager.cc:1486] Initializing Kudu cluster ID...
I20250626 01:57:27.037632 10581 catalog_manager.cc:1349] Generated new cluster ID: 9dc4731a2814488e80ec7d7e92567af7
I20250626 01:57:27.038023 10581 catalog_manager.cc:1497] Initializing Kudu internal certificate authority...
I20250626 01:57:27.084321 10581 catalog_manager.cc:1372] Generated new certificate authority record
I20250626 01:57:27.086812 10581 catalog_manager.cc:1506] Loading token signing keys...
I20250626 01:57:27.106318 10581 catalog_manager.cc:5955] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0: Generated new TSK 0
I20250626 01:57:27.107829 10581 catalog_manager.cc:1516] Initializing in-progress tserver states...
I20250626 01:57:27.137751 10490 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu
/tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.10.62.129:0
--local_ip_for_outbound_sockets=127.10.62.129
--webserver_interface=127.10.62.129
--webserver_port=0
--tserver_master_addrs=127.10.62.190:45805
--builtin_ntp_servers=127.10.62.148:39413
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--num_tablets_to_copy_simultaneously=1 with env {}
W20250626 01:57:27.512800 10591 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250626 01:57:27.513450 10591 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250626 01:57:27.514032 10591 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250626 01:57:27.548887 10591 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250626 01:57:27.549943 10591 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.10.62.129
I20250626 01:57:27.590189 10591 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.10.62.148:39413
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.10.62.129:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.10.62.129
--webserver_port=0
--tserver_master_addrs=127.10.62.190:45805
--num_tablets_to_copy_simultaneously=1
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.10.62.129
--log_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.18.0-SNAPSHOT
revision f7c956859e2f49c4cf1caffa969c1777a7a5d81c
build type FASTDEBUG
built by None at 26 Jun 2025 01:43:20 UTC on 5fd53c4cbb9d
build id 6773
TSAN enabled
I20250626 01:57:27.591948 10591 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250626 01:57:27.594059 10591 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250626 01:57:27.616063 10598 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250626 01:57:27.617733 10600 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250626 01:57:27.617110 10597 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250626 01:57:27.618108 10591 server_base.cc:1048] running on GCE node
I20250626 01:57:28.834713 10591 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250626 01:57:28.838181 10591 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250626 01:57:28.839736 10591 hybrid_clock.cc:648] HybridClock initialized: now 1750903048839672 us; error 97 us; skew 500 ppm
I20250626 01:57:28.840698 10591 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250626 01:57:28.849507 10591 webserver.cc:469] Webserver started at http://127.10.62.129:40381/ using document root <none> and password file <none>
I20250626 01:57:28.850672 10591 fs_manager.cc:362] Metadata directory not provided
I20250626 01:57:28.850924 10591 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250626 01:57:28.851532 10591 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250626 01:57:28.857071 10591 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-0/data/instance:
uuid: "2bd0037e52244b4dac3fbc01f4e904d1"
format_stamp: "Formatted at 2025-06-26 01:57:28 on dist-test-slave-k90p"
I20250626 01:57:28.858460 10591 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-0/wal/instance:
uuid: "2bd0037e52244b4dac3fbc01f4e904d1"
format_stamp: "Formatted at 2025-06-26 01:57:28 on dist-test-slave-k90p"
I20250626 01:57:28.867686 10591 fs_manager.cc:696] Time spent creating directory manager: real 0.008s	user 0.002s	sys 0.005s
I20250626 01:57:28.874603 10607 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20250626 01:57:28.875934 10591 fs_manager.cc:730] Time spent opening block manager: real 0.005s	user 0.003s	sys 0.000s
I20250626 01:57:28.876291 10591 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-0/data,/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-0/wal
uuid: "2bd0037e52244b4dac3fbc01f4e904d1"
format_stamp: "Formatted at 2025-06-26 01:57:28 on dist-test-slave-k90p"
I20250626 01:57:28.876654 10591 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250626 01:57:28.934816 10591 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20250626 01:57:28.936717 10591 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250626 01:57:28.937273 10591 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250626 01:57:28.941181 10591 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250626 01:57:28.946338 10591 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250626 01:57:28.946614 10591 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20250626 01:57:28.946878 10591 ts_tablet_manager.cc:610] Registered 0 tablets
I20250626 01:57:28.947067 10591 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20250626 01:57:29.122272 10591 rpc_server.cc:307] RPC server started. Bound to: 127.10.62.129:43217
I20250626 01:57:29.122409 10719 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.10.62.129:43217 every 8 connection(s)
I20250626 01:57:29.125341 10591 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-0/data/info.pb
I20250626 01:57:29.131601 10490 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu as pid 10591
I20250626 01:57:29.132112 10490 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-0/wal/instance
I20250626 01:57:29.138993 10490 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu
/tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.10.62.130:0
--local_ip_for_outbound_sockets=127.10.62.130
--webserver_interface=127.10.62.130
--webserver_port=0
--tserver_master_addrs=127.10.62.190:45805
--builtin_ntp_servers=127.10.62.148:39413
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--num_tablets_to_copy_simultaneously=1 with env {}
I20250626 01:57:29.151355 10720 heartbeater.cc:344] Connected to a master server at 127.10.62.190:45805
I20250626 01:57:29.151909 10720 heartbeater.cc:461] Registering TS with master...
I20250626 01:57:29.153162 10720 heartbeater.cc:507] Master 127.10.62.190:45805 requested a full tablet report, sending...
I20250626 01:57:29.156387 10532 ts_manager.cc:194] Registered new tserver with Master: 2bd0037e52244b4dac3fbc01f4e904d1 (127.10.62.129:43217)
I20250626 01:57:29.158635 10532 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.10.62.129:56553
W20250626 01:57:29.495420 10724 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250626 01:57:29.496018 10724 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250626 01:57:29.496618 10724 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250626 01:57:29.532577 10724 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250626 01:57:29.533553 10724 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.10.62.130
I20250626 01:57:29.573414 10724 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.10.62.148:39413
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.10.62.130:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.10.62.130
--webserver_port=0
--tserver_master_addrs=127.10.62.190:45805
--num_tablets_to_copy_simultaneously=1
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.10.62.130
--log_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.18.0-SNAPSHOT
revision f7c956859e2f49c4cf1caffa969c1777a7a5d81c
build type FASTDEBUG
built by None at 26 Jun 2025 01:43:20 UTC on 5fd53c4cbb9d
build id 6773
TSAN enabled
I20250626 01:57:29.575160 10724 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250626 01:57:29.577636 10724 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250626 01:57:29.597229 10731 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250626 01:57:30.163269 10720 heartbeater.cc:499] Master 127.10.62.190:45805 was elected leader, sending a full tablet report...
W20250626 01:57:29.597328 10730 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250626 01:57:29.600189 10733 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250626 01:57:30.801592 10732 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Connection time-out
I20250626 01:57:30.801738 10724 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
I20250626 01:57:30.806615 10724 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250626 01:57:30.809966 10724 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250626 01:57:30.811422 10724 hybrid_clock.cc:648] HybridClock initialized: now 1750903050811375 us; error 54 us; skew 500 ppm
I20250626 01:57:30.812414 10724 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250626 01:57:30.819906 10724 webserver.cc:469] Webserver started at http://127.10.62.130:38377/ using document root <none> and password file <none>
I20250626 01:57:30.821118 10724 fs_manager.cc:362] Metadata directory not provided
I20250626 01:57:30.821383 10724 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250626 01:57:30.821929 10724 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250626 01:57:30.827688 10724 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/data/instance:
uuid: "9a011f4230fc452abab60d8c23230642"
format_stamp: "Formatted at 2025-06-26 01:57:30 on dist-test-slave-k90p"
I20250626 01:57:30.829033 10724 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/wal/instance:
uuid: "9a011f4230fc452abab60d8c23230642"
format_stamp: "Formatted at 2025-06-26 01:57:30 on dist-test-slave-k90p"
I20250626 01:57:30.838052 10724 fs_manager.cc:696] Time spent creating directory manager: real 0.008s	user 0.009s	sys 0.000s
I20250626 01:57:30.844980 10740 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20250626 01:57:30.846200 10724 fs_manager.cc:730] Time spent opening block manager: real 0.004s	user 0.005s	sys 0.000s
I20250626 01:57:30.846570 10724 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/data,/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/wal
uuid: "9a011f4230fc452abab60d8c23230642"
format_stamp: "Formatted at 2025-06-26 01:57:30 on dist-test-slave-k90p"
I20250626 01:57:30.846978 10724 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250626 01:57:30.919689 10724 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20250626 01:57:30.921566 10724 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250626 01:57:30.922092 10724 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250626 01:57:30.925243 10724 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250626 01:57:30.930331 10724 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250626 01:57:30.930616 10724 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20250626 01:57:30.930928 10724 ts_tablet_manager.cc:610] Registered 0 tablets
I20250626 01:57:30.931156 10724 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s	user 0.002s	sys 0.000s
I20250626 01:57:31.100710 10724 rpc_server.cc:307] RPC server started. Bound to: 127.10.62.130:45677
I20250626 01:57:31.100832 10852 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.10.62.130:45677 every 8 connection(s)
I20250626 01:57:31.103796 10724 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/data/info.pb
I20250626 01:57:31.110831 10490 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu as pid 10724
I20250626 01:57:31.111395 10490 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/wal/instance
I20250626 01:57:31.131533 10853 heartbeater.cc:344] Connected to a master server at 127.10.62.190:45805
I20250626 01:57:31.132113 10853 heartbeater.cc:461] Registering TS with master...
I20250626 01:57:31.133663 10853 heartbeater.cc:507] Master 127.10.62.190:45805 requested a full tablet report, sending...
I20250626 01:57:31.136325 10531 ts_manager.cc:194] Registered new tserver with Master: 9a011f4230fc452abab60d8c23230642 (127.10.62.130:45677)
I20250626 01:57:31.137944 10531 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.10.62.130:44863
I20250626 01:57:31.149137 10490 external_mini_cluster.cc:934] 2 TS(s) registered with all masters
I20250626 01:57:31.189199 10490 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu with pid 10724
I20250626 01:57:31.213827 10490 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu with pid 10499
I20250626 01:57:31.253662 10490 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu
/tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.10.62.190:45805
--webserver_interface=127.10.62.190
--webserver_port=43985
--builtin_ntp_servers=127.10.62.148:39413
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.10.62.190:45805
--master_tombstone_evicted_tablet_replicas=false with env {}
W20250626 01:57:32.177604 10720 heartbeater.cc:646] Failed to heartbeat to 127.10.62.190:45805 (0 consecutive failures): Network error: Failed to send heartbeat to master: Client connection negotiation failed: client connection to 127.10.62.190:45805: connect: Connection refused (error 111)
W20250626 01:57:32.879990 10864 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250626 01:57:32.880717 10864 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250626 01:57:32.881209 10864 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250626 01:57:32.916532 10864 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20250626 01:57:32.916908 10864 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250626 01:57:32.917131 10864 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20250626 01:57:32.917342 10864 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20250626 01:57:32.958365 10864 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.10.62.148:39413
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/wal
--master_tombstone_evicted_tablet_replicas=false
--ipki_ca_key_size=768
--master_addresses=127.10.62.190:45805
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.10.62.190:45805
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.10.62.190
--webserver_port=43985
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.18.0-SNAPSHOT
revision f7c956859e2f49c4cf1caffa969c1777a7a5d81c
build type FASTDEBUG
built by None at 26 Jun 2025 01:43:20 UTC on 5fd53c4cbb9d
build id 6773
TSAN enabled
I20250626 01:57:32.960155 10864 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250626 01:57:32.962445 10864 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250626 01:57:32.984030 10870 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250626 01:57:32.984022 10873 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250626 01:57:32.984022 10871 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250626 01:57:32.986343 10864 server_base.cc:1048] running on GCE node
I20250626 01:57:34.214612 10864 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250626 01:57:34.218205 10864 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250626 01:57:34.219695 10864 hybrid_clock.cc:648] HybridClock initialized: now 1750903054219659 us; error 72 us; skew 500 ppm
I20250626 01:57:34.220775 10864 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250626 01:57:34.235705 10864 webserver.cc:469] Webserver started at http://127.10.62.190:43985/ using document root <none> and password file <none>
I20250626 01:57:34.237078 10864 fs_manager.cc:362] Metadata directory not provided
I20250626 01:57:34.237411 10864 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250626 01:57:34.248546 10864 fs_manager.cc:714] Time spent opening directory manager: real 0.007s	user 0.007s	sys 0.001s
I20250626 01:57:34.254987 10882 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20250626 01:57:34.256650 10864 fs_manager.cc:730] Time spent opening block manager: real 0.005s	user 0.002s	sys 0.001s
I20250626 01:57:34.257155 10864 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/data,/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/wal
uuid: "6299a79b2e1e4378a76d9a5ce8ca3fe0"
format_stamp: "Formatted at 2025-06-26 01:57:26 on dist-test-slave-k90p"
I20250626 01:57:34.259791 10864 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250626 01:57:34.361712 10864 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20250626 01:57:34.363706 10864 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250626 01:57:34.364296 10864 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250626 01:57:34.459228 10864 rpc_server.cc:307] RPC server started. Bound to: 127.10.62.190:45805
I20250626 01:57:34.459465 10933 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.10.62.190:45805 every 8 connection(s)
I20250626 01:57:34.463932 10864 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/master-0/data/info.pb
I20250626 01:57:34.464622 10490 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu as pid 10864
I20250626 01:57:34.489380 10934 sys_catalog.cc:263] Verifying existing consensus state
I20250626 01:57:34.496464 10934 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0: Bootstrap starting.
I20250626 01:57:34.548226 10934 log.cc:826] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0: Log is configured to *not* fsync() on all Append() calls
I20250626 01:57:34.570612 10934 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0: Bootstrap replayed 1/1 log segments. Stats: ops{read=4 overwritten=0 applied=4 ignored=0} inserts{seen=3 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250626 01:57:34.571959 10934 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0: Bootstrap complete.
I20250626 01:57:34.596253 10934 raft_consensus.cc:357] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6299a79b2e1e4378a76d9a5ce8ca3fe0" member_type: VOTER last_known_addr { host: "127.10.62.190" port: 45805 } }
I20250626 01:57:34.598959 10934 raft_consensus.cc:738] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 6299a79b2e1e4378a76d9a5ce8ca3fe0, State: Initialized, Role: FOLLOWER
I20250626 01:57:34.600487 10934 consensus_queue.cc:260] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 4, Last appended: 1.4, Last appended by leader: 4, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6299a79b2e1e4378a76d9a5ce8ca3fe0" member_type: VOTER last_known_addr { host: "127.10.62.190" port: 45805 } }
I20250626 01:57:34.601163 10934 raft_consensus.cc:397] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [term 1 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250626 01:57:34.601521 10934 raft_consensus.cc:491] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [term 1 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250626 01:57:34.601891 10934 raft_consensus.cc:3058] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [term 1 FOLLOWER]: Advancing to term 2
I20250626 01:57:34.609639 10934 raft_consensus.cc:513] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6299a79b2e1e4378a76d9a5ce8ca3fe0" member_type: VOTER last_known_addr { host: "127.10.62.190" port: 45805 } }
I20250626 01:57:34.610610 10934 leader_election.cc:304] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 6299a79b2e1e4378a76d9a5ce8ca3fe0; no voters: 
I20250626 01:57:34.612727 10934 leader_election.cc:290] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [CANDIDATE]: Term 2 election: Requested vote from peers 
I20250626 01:57:34.613323 10939 raft_consensus.cc:2802] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [term 2 FOLLOWER]: Leader election won for term 2
I20250626 01:57:34.617488 10939 raft_consensus.cc:695] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [term 2 LEADER]: Becoming Leader. State: Replica: 6299a79b2e1e4378a76d9a5ce8ca3fe0, State: Running, Role: LEADER
I20250626 01:57:34.618716 10939 consensus_queue.cc:237] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 4, Committed index: 4, Last appended: 1.4, Last appended by leader: 4, Current term: 2, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6299a79b2e1e4378a76d9a5ce8ca3fe0" member_type: VOTER last_known_addr { host: "127.10.62.190" port: 45805 } }
I20250626 01:57:34.619912 10934 sys_catalog.cc:564] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [sys.catalog]: configured and running, proceeding with master startup.
I20250626 01:57:34.644941 10941 sys_catalog.cc:455] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 6299a79b2e1e4378a76d9a5ce8ca3fe0. Latest consensus state: current_term: 2 leader_uuid: "6299a79b2e1e4378a76d9a5ce8ca3fe0" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6299a79b2e1e4378a76d9a5ce8ca3fe0" member_type: VOTER last_known_addr { host: "127.10.62.190" port: 45805 } } }
I20250626 01:57:34.645879 10941 sys_catalog.cc:458] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [sys.catalog]: This master's current role is: LEADER
I20250626 01:57:34.648944 10940 sys_catalog.cc:455] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 2 leader_uuid: "6299a79b2e1e4378a76d9a5ce8ca3fe0" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6299a79b2e1e4378a76d9a5ce8ca3fe0" member_type: VOTER last_known_addr { host: "127.10.62.190" port: 45805 } } }
I20250626 01:57:34.649734 10950 catalog_manager.cc:1477] Loading table and tablet metadata into memory...
I20250626 01:57:34.649825 10940 sys_catalog.cc:458] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0 [sys.catalog]: This master's current role is: LEADER
I20250626 01:57:34.664973 10950 catalog_manager.cc:1486] Initializing Kudu cluster ID...
I20250626 01:57:34.672811 10950 catalog_manager.cc:1261] Loaded cluster ID: 9dc4731a2814488e80ec7d7e92567af7
I20250626 01:57:34.673382 10950 catalog_manager.cc:1497] Initializing Kudu internal certificate authority...
I20250626 01:57:34.681758 10950 catalog_manager.cc:1506] Loading token signing keys...
I20250626 01:57:34.686226 10950 catalog_manager.cc:5966] T 00000000000000000000000000000000 P 6299a79b2e1e4378a76d9a5ce8ca3fe0: Loaded TSK: 0
I20250626 01:57:34.687988 10950 catalog_manager.cc:1516] Initializing in-progress tserver states...
I20250626 01:57:35.206621 10720 heartbeater.cc:344] Connected to a master server at 127.10.62.190:45805
I20250626 01:57:35.212357 10898 master_service.cc:432] Got heartbeat from unknown tserver (permanent_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" instance_seqno: 1750903049084308) as {username='slave'} at 127.10.62.129:49845; Asking this server to re-register.
I20250626 01:57:35.214370 10720 heartbeater.cc:461] Registering TS with master...
I20250626 01:57:35.215193 10720 heartbeater.cc:507] Master 127.10.62.190:45805 requested a full tablet report, sending...
I20250626 01:57:35.218202 10898 ts_manager.cc:194] Registered new tserver with Master: 2bd0037e52244b4dac3fbc01f4e904d1 (127.10.62.129:43217)
I20250626 01:57:35.225430 10490 external_mini_cluster.cc:934] 1 TS(s) registered with all masters
I20250626 01:57:35.226203 10490 test_util.cc:276] Using random seed: -492530377
I20250626 01:57:35.300856 10898 catalog_manager.cc:2232] Servicing CreateTable request from {username='slave'} at 127.0.0.1:42704:
name: "test-workload"
schema {
  columns {
    name: "key"
    type: INT32
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "int_val"
    type: INT32
    is_key: false
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "string_val"
    type: STRING
    is_key: false
    is_nullable: true
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 1
split_rows_range_bounds {
  rows: "<redacted>""\004\001\000\377\377\377\037\004\001\000\376\377\377?\004\001\000\375\377\377_"
  indirect_data: "<redacted>"""
}
partition_schema {
  range_schema {
    columns {
      name: "key"
    }
  }
}
I20250626 01:57:35.377465 10653 tablet_service.cc:1468] Processing CreateTablet for tablet 96f7f13eaa024c4c9a537775a646567d (DEFAULT_TABLE table=test-workload [id=f2b1f648579a4e60a8d59d3804821554]), partition=RANGE (key) PARTITION 1073741822 <= VALUES < 1610612733
I20250626 01:57:35.378870 10652 tablet_service.cc:1468] Processing CreateTablet for tablet f962b93a2dc243e6a30d87e43ed96ada (DEFAULT_TABLE table=test-workload [id=f2b1f648579a4e60a8d59d3804821554]), partition=RANGE (key) PARTITION 1610612733 <= VALUES
I20250626 01:57:35.378810 10654 tablet_service.cc:1468] Processing CreateTablet for tablet 110d8d0236f24a2d9e3db0d40b1f4055 (DEFAULT_TABLE table=test-workload [id=f2b1f648579a4e60a8d59d3804821554]), partition=RANGE (key) PARTITION 536870911 <= VALUES < 1073741822
I20250626 01:57:35.377979 10655 tablet_service.cc:1468] Processing CreateTablet for tablet 53317ba56bb94773ba0bd55bf74169f5 (DEFAULT_TABLE table=test-workload [id=f2b1f648579a4e60a8d59d3804821554]), partition=RANGE (key) PARTITION VALUES < 536870911
I20250626 01:57:35.381162 10653 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 96f7f13eaa024c4c9a537775a646567d. 1 dirs total, 0 dirs full, 0 dirs failed
I20250626 01:57:35.382071 10655 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 53317ba56bb94773ba0bd55bf74169f5. 1 dirs total, 0 dirs full, 0 dirs failed
I20250626 01:57:35.382867 10652 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet f962b93a2dc243e6a30d87e43ed96ada. 1 dirs total, 0 dirs full, 0 dirs failed
I20250626 01:57:35.384862 10654 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 110d8d0236f24a2d9e3db0d40b1f4055. 1 dirs total, 0 dirs full, 0 dirs failed
I20250626 01:57:35.430015 10970 tablet_bootstrap.cc:492] T 96f7f13eaa024c4c9a537775a646567d P 2bd0037e52244b4dac3fbc01f4e904d1: Bootstrap starting.
I20250626 01:57:35.437953 10970 tablet_bootstrap.cc:654] T 96f7f13eaa024c4c9a537775a646567d P 2bd0037e52244b4dac3fbc01f4e904d1: Neither blocks nor log segments found. Creating new log.
I20250626 01:57:35.441102 10970 log.cc:826] T 96f7f13eaa024c4c9a537775a646567d P 2bd0037e52244b4dac3fbc01f4e904d1: Log is configured to *not* fsync() on all Append() calls
I20250626 01:57:35.448647 10970 tablet_bootstrap.cc:492] T 96f7f13eaa024c4c9a537775a646567d P 2bd0037e52244b4dac3fbc01f4e904d1: No bootstrap required, opened a new log
I20250626 01:57:35.449462 10970 ts_tablet_manager.cc:1397] T 96f7f13eaa024c4c9a537775a646567d P 2bd0037e52244b4dac3fbc01f4e904d1: Time spent bootstrapping tablet: real 0.020s	user 0.012s	sys 0.006s
I20250626 01:57:35.475235 10970 raft_consensus.cc:357] T 96f7f13eaa024c4c9a537775a646567d P 2bd0037e52244b4dac3fbc01f4e904d1 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 43217 } }
I20250626 01:57:35.476197 10970 raft_consensus.cc:383] T 96f7f13eaa024c4c9a537775a646567d P 2bd0037e52244b4dac3fbc01f4e904d1 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250626 01:57:35.476637 10970 raft_consensus.cc:738] T 96f7f13eaa024c4c9a537775a646567d P 2bd0037e52244b4dac3fbc01f4e904d1 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 2bd0037e52244b4dac3fbc01f4e904d1, State: Initialized, Role: FOLLOWER
I20250626 01:57:35.477877 10970 consensus_queue.cc:260] T 96f7f13eaa024c4c9a537775a646567d P 2bd0037e52244b4dac3fbc01f4e904d1 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 43217 } }
I20250626 01:57:35.478590 10970 raft_consensus.cc:397] T 96f7f13eaa024c4c9a537775a646567d P 2bd0037e52244b4dac3fbc01f4e904d1 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250626 01:57:35.478935 10970 raft_consensus.cc:491] T 96f7f13eaa024c4c9a537775a646567d P 2bd0037e52244b4dac3fbc01f4e904d1 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250626 01:57:35.479375 10970 raft_consensus.cc:3058] T 96f7f13eaa024c4c9a537775a646567d P 2bd0037e52244b4dac3fbc01f4e904d1 [term 0 FOLLOWER]: Advancing to term 1
I20250626 01:57:35.485141 10970 raft_consensus.cc:513] T 96f7f13eaa024c4c9a537775a646567d P 2bd0037e52244b4dac3fbc01f4e904d1 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 43217 } }
I20250626 01:57:35.486577 10970 leader_election.cc:304] T 96f7f13eaa024c4c9a537775a646567d P 2bd0037e52244b4dac3fbc01f4e904d1 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 2bd0037e52244b4dac3fbc01f4e904d1; no voters: 
I20250626 01:57:35.490283 10970 leader_election.cc:290] T 96f7f13eaa024c4c9a537775a646567d P 2bd0037e52244b4dac3fbc01f4e904d1 [CANDIDATE]: Term 1 election: Requested vote from peers 
I20250626 01:57:35.490813 10972 raft_consensus.cc:2802] T 96f7f13eaa024c4c9a537775a646567d P 2bd0037e52244b4dac3fbc01f4e904d1 [term 1 FOLLOWER]: Leader election won for term 1
I20250626 01:57:35.502696 10970 ts_tablet_manager.cc:1428] T 96f7f13eaa024c4c9a537775a646567d P 2bd0037e52244b4dac3fbc01f4e904d1: Time spent starting tablet: real 0.053s	user 0.040s	sys 0.014s
I20250626 01:57:35.504647 10970 tablet_bootstrap.cc:492] T f962b93a2dc243e6a30d87e43ed96ada P 2bd0037e52244b4dac3fbc01f4e904d1: Bootstrap starting.
I20250626 01:57:35.507877 10972 raft_consensus.cc:695] T 96f7f13eaa024c4c9a537775a646567d P 2bd0037e52244b4dac3fbc01f4e904d1 [term 1 LEADER]: Becoming Leader. State: Replica: 2bd0037e52244b4dac3fbc01f4e904d1, State: Running, Role: LEADER
I20250626 01:57:35.509548 10972 consensus_queue.cc:237] T 96f7f13eaa024c4c9a537775a646567d P 2bd0037e52244b4dac3fbc01f4e904d1 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 43217 } }
I20250626 01:57:35.514196 10970 tablet_bootstrap.cc:654] T f962b93a2dc243e6a30d87e43ed96ada P 2bd0037e52244b4dac3fbc01f4e904d1: Neither blocks nor log segments found. Creating new log.
I20250626 01:57:35.528664 10970 tablet_bootstrap.cc:492] T f962b93a2dc243e6a30d87e43ed96ada P 2bd0037e52244b4dac3fbc01f4e904d1: No bootstrap required, opened a new log
I20250626 01:57:35.529356 10970 ts_tablet_manager.cc:1397] T f962b93a2dc243e6a30d87e43ed96ada P 2bd0037e52244b4dac3fbc01f4e904d1: Time spent bootstrapping tablet: real 0.025s	user 0.019s	sys 0.003s
I20250626 01:57:35.533830 10970 raft_consensus.cc:357] T f962b93a2dc243e6a30d87e43ed96ada P 2bd0037e52244b4dac3fbc01f4e904d1 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 43217 } }
I20250626 01:57:35.535013 10970 raft_consensus.cc:383] T f962b93a2dc243e6a30d87e43ed96ada P 2bd0037e52244b4dac3fbc01f4e904d1 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250626 01:57:35.535594 10970 raft_consensus.cc:738] T f962b93a2dc243e6a30d87e43ed96ada P 2bd0037e52244b4dac3fbc01f4e904d1 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 2bd0037e52244b4dac3fbc01f4e904d1, State: Initialized, Role: FOLLOWER
I20250626 01:57:35.537004 10970 consensus_queue.cc:260] T f962b93a2dc243e6a30d87e43ed96ada P 2bd0037e52244b4dac3fbc01f4e904d1 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 43217 } }
I20250626 01:57:35.538317 10970 raft_consensus.cc:397] T f962b93a2dc243e6a30d87e43ed96ada P 2bd0037e52244b4dac3fbc01f4e904d1 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250626 01:57:35.538863 10970 raft_consensus.cc:491] T f962b93a2dc243e6a30d87e43ed96ada P 2bd0037e52244b4dac3fbc01f4e904d1 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250626 01:57:35.539461 10970 raft_consensus.cc:3058] T f962b93a2dc243e6a30d87e43ed96ada P 2bd0037e52244b4dac3fbc01f4e904d1 [term 0 FOLLOWER]: Advancing to term 1
I20250626 01:57:35.548638 10970 raft_consensus.cc:513] T f962b93a2dc243e6a30d87e43ed96ada P 2bd0037e52244b4dac3fbc01f4e904d1 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 43217 } }
I20250626 01:57:35.549762 10970 leader_election.cc:304] T f962b93a2dc243e6a30d87e43ed96ada P 2bd0037e52244b4dac3fbc01f4e904d1 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 2bd0037e52244b4dac3fbc01f4e904d1; no voters: 
I20250626 01:57:35.550685 10970 leader_election.cc:290] T f962b93a2dc243e6a30d87e43ed96ada P 2bd0037e52244b4dac3fbc01f4e904d1 [CANDIDATE]: Term 1 election: Requested vote from peers 
I20250626 01:57:35.549947 10898 catalog_manager.cc:5582] T 96f7f13eaa024c4c9a537775a646567d P 2bd0037e52244b4dac3fbc01f4e904d1 reported cstate change: term changed from 0 to 1, leader changed from <none> to 2bd0037e52244b4dac3fbc01f4e904d1 (127.10.62.129). New cstate: current_term: 1 leader_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 43217 } health_report { overall_health: HEALTHY } } }
I20250626 01:57:35.551121 10974 raft_consensus.cc:2802] T f962b93a2dc243e6a30d87e43ed96ada P 2bd0037e52244b4dac3fbc01f4e904d1 [term 1 FOLLOWER]: Leader election won for term 1
I20250626 01:57:35.553077 10970 ts_tablet_manager.cc:1428] T f962b93a2dc243e6a30d87e43ed96ada P 2bd0037e52244b4dac3fbc01f4e904d1: Time spent starting tablet: real 0.023s	user 0.020s	sys 0.000s
I20250626 01:57:35.554327 10970 tablet_bootstrap.cc:492] T 53317ba56bb94773ba0bd55bf74169f5 P 2bd0037e52244b4dac3fbc01f4e904d1: Bootstrap starting.
I20250626 01:57:35.562458 10970 tablet_bootstrap.cc:654] T 53317ba56bb94773ba0bd55bf74169f5 P 2bd0037e52244b4dac3fbc01f4e904d1: Neither blocks nor log segments found. Creating new log.
I20250626 01:57:35.571208 10974 raft_consensus.cc:695] T f962b93a2dc243e6a30d87e43ed96ada P 2bd0037e52244b4dac3fbc01f4e904d1 [term 1 LEADER]: Becoming Leader. State: Replica: 2bd0037e52244b4dac3fbc01f4e904d1, State: Running, Role: LEADER
I20250626 01:57:35.572521 10974 consensus_queue.cc:237] T f962b93a2dc243e6a30d87e43ed96ada P 2bd0037e52244b4dac3fbc01f4e904d1 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 43217 } }
I20250626 01:57:35.573974 10970 tablet_bootstrap.cc:492] T 53317ba56bb94773ba0bd55bf74169f5 P 2bd0037e52244b4dac3fbc01f4e904d1: No bootstrap required, opened a new log
I20250626 01:57:35.574538 10970 ts_tablet_manager.cc:1397] T 53317ba56bb94773ba0bd55bf74169f5 P 2bd0037e52244b4dac3fbc01f4e904d1: Time spent bootstrapping tablet: real 0.020s	user 0.013s	sys 0.004s
I20250626 01:57:35.578555 10970 raft_consensus.cc:357] T 53317ba56bb94773ba0bd55bf74169f5 P 2bd0037e52244b4dac3fbc01f4e904d1 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 43217 } }
I20250626 01:57:35.579501 10970 raft_consensus.cc:383] T 53317ba56bb94773ba0bd55bf74169f5 P 2bd0037e52244b4dac3fbc01f4e904d1 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250626 01:57:35.579880 10970 raft_consensus.cc:738] T 53317ba56bb94773ba0bd55bf74169f5 P 2bd0037e52244b4dac3fbc01f4e904d1 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 2bd0037e52244b4dac3fbc01f4e904d1, State: Initialized, Role: FOLLOWER
I20250626 01:57:35.580811 10970 consensus_queue.cc:260] T 53317ba56bb94773ba0bd55bf74169f5 P 2bd0037e52244b4dac3fbc01f4e904d1 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 43217 } }
I20250626 01:57:35.581830 10970 raft_consensus.cc:397] T 53317ba56bb94773ba0bd55bf74169f5 P 2bd0037e52244b4dac3fbc01f4e904d1 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250626 01:57:35.582249 10970 raft_consensus.cc:491] T 53317ba56bb94773ba0bd55bf74169f5 P 2bd0037e52244b4dac3fbc01f4e904d1 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250626 01:57:35.582749 10970 raft_consensus.cc:3058] T 53317ba56bb94773ba0bd55bf74169f5 P 2bd0037e52244b4dac3fbc01f4e904d1 [term 0 FOLLOWER]: Advancing to term 1
I20250626 01:57:35.594218 10970 raft_consensus.cc:513] T 53317ba56bb94773ba0bd55bf74169f5 P 2bd0037e52244b4dac3fbc01f4e904d1 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 43217 } }
I20250626 01:57:35.595012 10970 leader_election.cc:304] T 53317ba56bb94773ba0bd55bf74169f5 P 2bd0037e52244b4dac3fbc01f4e904d1 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 2bd0037e52244b4dac3fbc01f4e904d1; no voters: 
I20250626 01:57:35.596199 10978 raft_consensus.cc:2802] T 53317ba56bb94773ba0bd55bf74169f5 P 2bd0037e52244b4dac3fbc01f4e904d1 [term 1 FOLLOWER]: Leader election won for term 1
I20250626 01:57:35.596808 10970 leader_election.cc:290] T 53317ba56bb94773ba0bd55bf74169f5 P 2bd0037e52244b4dac3fbc01f4e904d1 [CANDIDATE]: Term 1 election: Requested vote from peers 
I20250626 01:57:35.596895 10978 raft_consensus.cc:695] T 53317ba56bb94773ba0bd55bf74169f5 P 2bd0037e52244b4dac3fbc01f4e904d1 [term 1 LEADER]: Becoming Leader. State: Replica: 2bd0037e52244b4dac3fbc01f4e904d1, State: Running, Role: LEADER
I20250626 01:57:35.597788 10978 consensus_queue.cc:237] T 53317ba56bb94773ba0bd55bf74169f5 P 2bd0037e52244b4dac3fbc01f4e904d1 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 43217 } }
I20250626 01:57:35.599568 10970 ts_tablet_manager.cc:1428] T 53317ba56bb94773ba0bd55bf74169f5 P 2bd0037e52244b4dac3fbc01f4e904d1: Time spent starting tablet: real 0.025s	user 0.020s	sys 0.004s
I20250626 01:57:35.599467 10898 catalog_manager.cc:5582] T f962b93a2dc243e6a30d87e43ed96ada P 2bd0037e52244b4dac3fbc01f4e904d1 reported cstate change: term changed from 0 to 1, leader changed from <none> to 2bd0037e52244b4dac3fbc01f4e904d1 (127.10.62.129). New cstate: current_term: 1 leader_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 43217 } health_report { overall_health: HEALTHY } } }
I20250626 01:57:35.601584 10970 tablet_bootstrap.cc:492] T 110d8d0236f24a2d9e3db0d40b1f4055 P 2bd0037e52244b4dac3fbc01f4e904d1: Bootstrap starting.
I20250626 01:57:35.612108 10970 tablet_bootstrap.cc:654] T 110d8d0236f24a2d9e3db0d40b1f4055 P 2bd0037e52244b4dac3fbc01f4e904d1: Neither blocks nor log segments found. Creating new log.
I20250626 01:57:35.619328 10899 catalog_manager.cc:5582] T 53317ba56bb94773ba0bd55bf74169f5 P 2bd0037e52244b4dac3fbc01f4e904d1 reported cstate change: term changed from 0 to 1, leader changed from <none> to 2bd0037e52244b4dac3fbc01f4e904d1 (127.10.62.129). New cstate: current_term: 1 leader_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 43217 } health_report { overall_health: HEALTHY } } }
I20250626 01:57:35.620690 10970 tablet_bootstrap.cc:492] T 110d8d0236f24a2d9e3db0d40b1f4055 P 2bd0037e52244b4dac3fbc01f4e904d1: No bootstrap required, opened a new log
I20250626 01:57:35.621243 10970 ts_tablet_manager.cc:1397] T 110d8d0236f24a2d9e3db0d40b1f4055 P 2bd0037e52244b4dac3fbc01f4e904d1: Time spent bootstrapping tablet: real 0.020s	user 0.009s	sys 0.008s
I20250626 01:57:35.624466 10970 raft_consensus.cc:357] T 110d8d0236f24a2d9e3db0d40b1f4055 P 2bd0037e52244b4dac3fbc01f4e904d1 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 43217 } }
I20250626 01:57:35.625197 10970 raft_consensus.cc:383] T 110d8d0236f24a2d9e3db0d40b1f4055 P 2bd0037e52244b4dac3fbc01f4e904d1 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250626 01:57:35.625531 10970 raft_consensus.cc:738] T 110d8d0236f24a2d9e3db0d40b1f4055 P 2bd0037e52244b4dac3fbc01f4e904d1 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 2bd0037e52244b4dac3fbc01f4e904d1, State: Initialized, Role: FOLLOWER
I20250626 01:57:35.626430 10970 consensus_queue.cc:260] T 110d8d0236f24a2d9e3db0d40b1f4055 P 2bd0037e52244b4dac3fbc01f4e904d1 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 43217 } }
I20250626 01:57:35.627836 10970 raft_consensus.cc:397] T 110d8d0236f24a2d9e3db0d40b1f4055 P 2bd0037e52244b4dac3fbc01f4e904d1 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250626 01:57:35.628340 10970 raft_consensus.cc:491] T 110d8d0236f24a2d9e3db0d40b1f4055 P 2bd0037e52244b4dac3fbc01f4e904d1 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250626 01:57:35.628726 10970 raft_consensus.cc:3058] T 110d8d0236f24a2d9e3db0d40b1f4055 P 2bd0037e52244b4dac3fbc01f4e904d1 [term 0 FOLLOWER]: Advancing to term 1
I20250626 01:57:35.636356 10970 raft_consensus.cc:513] T 110d8d0236f24a2d9e3db0d40b1f4055 P 2bd0037e52244b4dac3fbc01f4e904d1 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 43217 } }
I20250626 01:57:35.637365 10970 leader_election.cc:304] T 110d8d0236f24a2d9e3db0d40b1f4055 P 2bd0037e52244b4dac3fbc01f4e904d1 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 2bd0037e52244b4dac3fbc01f4e904d1; no voters: 
I20250626 01:57:35.638151 10970 leader_election.cc:290] T 110d8d0236f24a2d9e3db0d40b1f4055 P 2bd0037e52244b4dac3fbc01f4e904d1 [CANDIDATE]: Term 1 election: Requested vote from peers 
I20250626 01:57:35.638381 10978 raft_consensus.cc:2802] T 110d8d0236f24a2d9e3db0d40b1f4055 P 2bd0037e52244b4dac3fbc01f4e904d1 [term 1 FOLLOWER]: Leader election won for term 1
I20250626 01:57:35.639672 10978 raft_consensus.cc:695] T 110d8d0236f24a2d9e3db0d40b1f4055 P 2bd0037e52244b4dac3fbc01f4e904d1 [term 1 LEADER]: Becoming Leader. State: Replica: 2bd0037e52244b4dac3fbc01f4e904d1, State: Running, Role: LEADER
I20250626 01:57:35.640738 10970 ts_tablet_manager.cc:1428] T 110d8d0236f24a2d9e3db0d40b1f4055 P 2bd0037e52244b4dac3fbc01f4e904d1: Time spent starting tablet: real 0.019s	user 0.014s	sys 0.004s
I20250626 01:57:35.640558 10978 consensus_queue.cc:237] T 110d8d0236f24a2d9e3db0d40b1f4055 P 2bd0037e52244b4dac3fbc01f4e904d1 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 43217 } }
W20250626 01:57:35.641534 10721 tablet.cc:2378] T 110d8d0236f24a2d9e3db0d40b1f4055 P 2bd0037e52244b4dac3fbc01f4e904d1: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20250626 01:57:35.649613 10899 catalog_manager.cc:5582] T 110d8d0236f24a2d9e3db0d40b1f4055 P 2bd0037e52244b4dac3fbc01f4e904d1 reported cstate change: term changed from 0 to 1, leader changed from <none> to 2bd0037e52244b4dac3fbc01f4e904d1 (127.10.62.129). New cstate: current_term: 1 leader_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 43217 } health_report { overall_health: HEALTHY } } }
W20250626 01:57:37.015443 10990 meta_cache.cc:1261] Time spent looking up entry by key: real 0.063s	user 0.001s	sys 0.000s
W20250626 01:57:39.311295 10991 meta_cache.cc:1261] Time spent looking up entry by key: real 0.084s	user 0.004s	sys 0.000s
I20250626 01:57:41.605423 10490 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu
/tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.10.62.130:45677
--local_ip_for_outbound_sockets=127.10.62.130
--tserver_master_addrs=127.10.62.190:45805
--webserver_port=38377
--webserver_interface=127.10.62.130
--builtin_ntp_servers=127.10.62.148:39413
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--num_tablets_to_copy_simultaneously=1 with env {}
W20250626 01:57:42.014108 11010 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250626 01:57:42.014886 11010 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250626 01:57:42.015537 11010 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250626 01:57:42.053758 11010 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250626 01:57:42.054888 11010 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.10.62.130
I20250626 01:57:42.104414 11010 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.10.62.148:39413
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.10.62.130:45677
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.10.62.130
--webserver_port=38377
--tserver_master_addrs=127.10.62.190:45805
--num_tablets_to_copy_simultaneously=1
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.10.62.130
--log_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.18.0-SNAPSHOT
revision f7c956859e2f49c4cf1caffa969c1777a7a5d81c
build type FASTDEBUG
built by None at 26 Jun 2025 01:43:20 UTC on 5fd53c4cbb9d
build id 6773
TSAN enabled
I20250626 01:57:42.106298 11010 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250626 01:57:42.108879 11010 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250626 01:57:42.132354 11016 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250626 01:57:42.133719 11017 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250626 01:57:42.136826 11019 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250626 01:57:43.377765 11010 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
W20250626 01:57:43.378438 11018 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Connection time-out
I20250626 01:57:43.383742 11010 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250626 01:57:43.387585 11010 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250626 01:57:43.389285 11010 hybrid_clock.cc:648] HybridClock initialized: now 1750903063389227 us; error 59 us; skew 500 ppm
I20250626 01:57:43.390412 11010 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250626 01:57:43.399912 11010 webserver.cc:469] Webserver started at http://127.10.62.130:38377/ using document root <none> and password file <none>
I20250626 01:57:43.401225 11010 fs_manager.cc:362] Metadata directory not provided
I20250626 01:57:43.401614 11010 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250626 01:57:43.412700 11010 fs_manager.cc:714] Time spent opening directory manager: real 0.007s	user 0.007s	sys 0.001s
I20250626 01:57:43.419344 11027 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20250626 01:57:43.420976 11010 fs_manager.cc:730] Time spent opening block manager: real 0.005s	user 0.002s	sys 0.001s
I20250626 01:57:43.421490 11010 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/data,/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/wal
uuid: "9a011f4230fc452abab60d8c23230642"
format_stamp: "Formatted at 2025-06-26 01:57:30 on dist-test-slave-k90p"
I20250626 01:57:43.423981 11010 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250626 01:57:43.487529 11010 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20250626 01:57:43.489640 11010 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250626 01:57:43.490281 11010 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250626 01:57:43.493965 11010 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250626 01:57:43.503273 11010 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250626 01:57:43.503599 11010 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20250626 01:57:43.503916 11010 ts_tablet_manager.cc:610] Registered 0 tablets
I20250626 01:57:43.504091 11010 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20250626 01:57:43.695974 11010 rpc_server.cc:307] RPC server started. Bound to: 127.10.62.130:45677
I20250626 01:57:43.696141 11139 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.10.62.130:45677 every 8 connection(s)
I20250626 01:57:43.699615 11010 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750903044745843-10490-0/minicluster-data/ts-1/data/info.pb
I20250626 01:57:43.709635 10490 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu as pid 11010
I20250626 01:57:43.736193 11140 heartbeater.cc:344] Connected to a master server at 127.10.62.190:45805
I20250626 01:57:43.736855 11140 heartbeater.cc:461] Registering TS with master...
I20250626 01:57:43.738583 11140 heartbeater.cc:507] Master 127.10.62.190:45805 requested a full tablet report, sending...
I20250626 01:57:43.742125 10897 ts_manager.cc:194] Registered new tserver with Master: 9a011f4230fc452abab60d8c23230642 (127.10.62.130:45677)
I20250626 01:57:43.743443 11146 ts_tablet_manager.cc:927] T 110d8d0236f24a2d9e3db0d40b1f4055 P 9a011f4230fc452abab60d8c23230642: Initiating tablet copy from peer 2bd0037e52244b4dac3fbc01f4e904d1 (127.10.62.129:43217)
I20250626 01:57:43.744881 10897 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.10.62.130:57499
I20250626 01:57:43.747521 11146 tablet_copy_client.cc:323] T 110d8d0236f24a2d9e3db0d40b1f4055 P 9a011f4230fc452abab60d8c23230642: tablet copy: Beginning tablet copy session from remote peer at address 127.10.62.129:43217
I20250626 01:57:43.759819 10695 tablet_copy_service.cc:140] P 2bd0037e52244b4dac3fbc01f4e904d1: Received BeginTabletCopySession request for tablet 110d8d0236f24a2d9e3db0d40b1f4055 from peer 9a011f4230fc452abab60d8c23230642 ({username='slave'} at 127.10.62.130:54599)
I20250626 01:57:43.760567 10695 tablet_copy_service.cc:161] P 2bd0037e52244b4dac3fbc01f4e904d1: Beginning new tablet copy session on tablet 110d8d0236f24a2d9e3db0d40b1f4055 from peer 9a011f4230fc452abab60d8c23230642 at {username='slave'} at 127.10.62.130:54599: session id = 9a011f4230fc452abab60d8c23230642-110d8d0236f24a2d9e3db0d40b1f4055
I20250626 01:57:43.768748 10695 tablet_copy_source_session.cc:215] T 110d8d0236f24a2d9e3db0d40b1f4055 P 2bd0037e52244b4dac3fbc01f4e904d1: Tablet Copy: opened 0 blocks and 1 log segments
I20250626 01:57:43.775933 11146 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 110d8d0236f24a2d9e3db0d40b1f4055. 1 dirs total, 0 dirs full, 0 dirs failed
I20250626 01:57:43.798775 11146 tablet_copy_client.cc:806] T 110d8d0236f24a2d9e3db0d40b1f4055 P 9a011f4230fc452abab60d8c23230642: tablet copy: Starting download of 0 data blocks...
I20250626 01:57:43.799707 11146 tablet_copy_client.cc:670] T 110d8d0236f24a2d9e3db0d40b1f4055 P 9a011f4230fc452abab60d8c23230642: tablet copy: Starting download of 1 WAL segments...
I20250626 01:57:43.814174 11146 tablet_copy_client.cc:538] T 110d8d0236f24a2d9e3db0d40b1f4055 P 9a011f4230fc452abab60d8c23230642: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20250626 01:57:43.827960 11146 tablet_bootstrap.cc:492] T 110d8d0236f24a2d9e3db0d40b1f4055 P 9a011f4230fc452abab60d8c23230642: Bootstrap starting.
I20250626 01:57:43.990020 11146 log.cc:826] T 110d8d0236f24a2d9e3db0d40b1f4055 P 9a011f4230fc452abab60d8c23230642: Log is configured to *not* fsync() on all Append() calls
I20250626 01:57:44.750310 11140 heartbeater.cc:499] Master 127.10.62.190:45805 was elected leader, sending a full tablet report...
I20250626 01:57:45.317655 11146 tablet_bootstrap.cc:492] T 110d8d0236f24a2d9e3db0d40b1f4055 P 9a011f4230fc452abab60d8c23230642: Bootstrap replayed 1/1 log segments. Stats: ops{read=218 overwritten=0 applied=218 ignored=0} inserts{seen=2657 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250626 01:57:45.318720 11146 tablet_bootstrap.cc:492] T 110d8d0236f24a2d9e3db0d40b1f4055 P 9a011f4230fc452abab60d8c23230642: Bootstrap complete.
I20250626 01:57:45.319514 11146 ts_tablet_manager.cc:1397] T 110d8d0236f24a2d9e3db0d40b1f4055 P 9a011f4230fc452abab60d8c23230642: Time spent bootstrapping tablet: real 1.492s	user 1.424s	sys 0.065s
I20250626 01:57:45.336550 11146 raft_consensus.cc:357] T 110d8d0236f24a2d9e3db0d40b1f4055 P 9a011f4230fc452abab60d8c23230642 [term 1 NON_PARTICIPANT]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 43217 } }
I20250626 01:57:45.337579 11146 raft_consensus.cc:738] T 110d8d0236f24a2d9e3db0d40b1f4055 P 9a011f4230fc452abab60d8c23230642 [term 1 NON_PARTICIPANT]: Becoming Follower/Learner. State: Replica: 9a011f4230fc452abab60d8c23230642, State: Initialized, Role: NON_PARTICIPANT
I20250626 01:57:45.338483 11146 consensus_queue.cc:260] T 110d8d0236f24a2d9e3db0d40b1f4055 P 9a011f4230fc452abab60d8c23230642 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 218, Last appended: 1.218, Last appended by leader: 218, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 43217 } }
I20250626 01:57:45.342561 11146 ts_tablet_manager.cc:1428] T 110d8d0236f24a2d9e3db0d40b1f4055 P 9a011f4230fc452abab60d8c23230642: Time spent starting tablet: real 0.023s	user 0.022s	sys 0.000s
I20250626 01:57:45.344722 10695 tablet_copy_service.cc:342] P 2bd0037e52244b4dac3fbc01f4e904d1: Request end of tablet copy session 9a011f4230fc452abab60d8c23230642-110d8d0236f24a2d9e3db0d40b1f4055 received from {username='slave'} at 127.10.62.130:54599
I20250626 01:57:45.345212 10695 tablet_copy_service.cc:434] P 2bd0037e52244b4dac3fbc01f4e904d1: ending tablet copy session 9a011f4230fc452abab60d8c23230642-110d8d0236f24a2d9e3db0d40b1f4055 on tablet 110d8d0236f24a2d9e3db0d40b1f4055 with peer 9a011f4230fc452abab60d8c23230642
W20250626 01:57:45.348754 11146 ts_tablet_manager.cc:726] T 110d8d0236f24a2d9e3db0d40b1f4055 P 9a011f4230fc452abab60d8c23230642: Tablet Copy: Invalid argument: Leader has replica of tablet 110d8d0236f24a2d9e3db0d40b1f4055 with term 0, which is lower than last-logged term 1 on local replica. Rejecting tablet copy request
I20250626 01:57:45.356948 11146 ts_tablet_manager.cc:927] T 53317ba56bb94773ba0bd55bf74169f5 P 9a011f4230fc452abab60d8c23230642: Initiating tablet copy from peer 2bd0037e52244b4dac3fbc01f4e904d1 (127.10.62.129:43217)
I20250626 01:57:45.359329 11146 tablet_copy_client.cc:323] T 53317ba56bb94773ba0bd55bf74169f5 P 9a011f4230fc452abab60d8c23230642: tablet copy: Beginning tablet copy session from remote peer at address 127.10.62.129:43217
I20250626 01:57:45.361321 10695 tablet_copy_service.cc:140] P 2bd0037e52244b4dac3fbc01f4e904d1: Received BeginTabletCopySession request for tablet 53317ba56bb94773ba0bd55bf74169f5 from peer 9a011f4230fc452abab60d8c23230642 ({username='slave'} at 127.10.62.130:54599)
I20250626 01:57:45.361953 10695 tablet_copy_service.cc:161] P 2bd0037e52244b4dac3fbc01f4e904d1: Beginning new tablet copy session on tablet 53317ba56bb94773ba0bd55bf74169f5 from peer 9a011f4230fc452abab60d8c23230642 at {username='slave'} at 127.10.62.130:54599: session id = 9a011f4230fc452abab60d8c23230642-53317ba56bb94773ba0bd55bf74169f5
I20250626 01:57:45.367723 10695 tablet_copy_source_session.cc:215] T 53317ba56bb94773ba0bd55bf74169f5 P 2bd0037e52244b4dac3fbc01f4e904d1: Tablet Copy: opened 0 blocks and 1 log segments
I20250626 01:57:45.370419 11146 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 53317ba56bb94773ba0bd55bf74169f5. 1 dirs total, 0 dirs full, 0 dirs failed
I20250626 01:57:45.380599 11146 tablet_copy_client.cc:806] T 53317ba56bb94773ba0bd55bf74169f5 P 9a011f4230fc452abab60d8c23230642: tablet copy: Starting download of 0 data blocks...
I20250626 01:57:45.381162 11146 tablet_copy_client.cc:670] T 53317ba56bb94773ba0bd55bf74169f5 P 9a011f4230fc452abab60d8c23230642: tablet copy: Starting download of 1 WAL segments...
I20250626 01:57:45.394434 11146 tablet_copy_client.cc:538] T 53317ba56bb94773ba0bd55bf74169f5 P 9a011f4230fc452abab60d8c23230642: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20250626 01:57:45.400707 11146 tablet_bootstrap.cc:492] T 53317ba56bb94773ba0bd55bf74169f5 P 9a011f4230fc452abab60d8c23230642: Bootstrap starting.
I20250626 01:57:46.666944 11146 tablet_bootstrap.cc:492] T 53317ba56bb94773ba0bd55bf74169f5 P 9a011f4230fc452abab60d8c23230642: Bootstrap replayed 1/1 log segments. Stats: ops{read=218 overwritten=0 applied=218 ignored=0} inserts{seen=2674 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250626 01:57:46.667948 11146 tablet_bootstrap.cc:492] T 53317ba56bb94773ba0bd55bf74169f5 P 9a011f4230fc452abab60d8c23230642: Bootstrap complete.
I20250626 01:57:46.668637 11146 ts_tablet_manager.cc:1397] T 53317ba56bb94773ba0bd55bf74169f5 P 9a011f4230fc452abab60d8c23230642: Time spent bootstrapping tablet: real 1.268s	user 1.232s	sys 0.036s
I20250626 01:57:46.671221 11146 raft_consensus.cc:357] T 53317ba56bb94773ba0bd55bf74169f5 P 9a011f4230fc452abab60d8c23230642 [term 1 NON_PARTICIPANT]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 43217 } }
I20250626 01:57:46.671664 11146 raft_consensus.cc:738] T 53317ba56bb94773ba0bd55bf74169f5 P 9a011f4230fc452abab60d8c23230642 [term 1 NON_PARTICIPANT]: Becoming Follower/Learner. State: Replica: 9a011f4230fc452abab60d8c23230642, State: Initialized, Role: NON_PARTICIPANT
I20250626 01:57:46.672204 11146 consensus_queue.cc:260] T 53317ba56bb94773ba0bd55bf74169f5 P 9a011f4230fc452abab60d8c23230642 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 218, Last appended: 1.218, Last appended by leader: 218, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 43217 } }
I20250626 01:57:46.677346 11146 ts_tablet_manager.cc:1428] T 53317ba56bb94773ba0bd55bf74169f5 P 9a011f4230fc452abab60d8c23230642: Time spent starting tablet: real 0.008s	user 0.005s	sys 0.004s
I20250626 01:57:46.679455 10695 tablet_copy_service.cc:342] P 2bd0037e52244b4dac3fbc01f4e904d1: Request end of tablet copy session 9a011f4230fc452abab60d8c23230642-53317ba56bb94773ba0bd55bf74169f5 received from {username='slave'} at 127.10.62.130:54599
I20250626 01:57:46.680150 10695 tablet_copy_service.cc:434] P 2bd0037e52244b4dac3fbc01f4e904d1: ending tablet copy session 9a011f4230fc452abab60d8c23230642-53317ba56bb94773ba0bd55bf74169f5 on tablet 53317ba56bb94773ba0bd55bf74169f5 with peer 9a011f4230fc452abab60d8c23230642
I20250626 01:57:46.683701 11146 ts_tablet_manager.cc:927] T f962b93a2dc243e6a30d87e43ed96ada P 9a011f4230fc452abab60d8c23230642: Initiating tablet copy from peer 2bd0037e52244b4dac3fbc01f4e904d1 (127.10.62.129:43217)
I20250626 01:57:46.686014 11146 tablet_copy_client.cc:323] T f962b93a2dc243e6a30d87e43ed96ada P 9a011f4230fc452abab60d8c23230642: tablet copy: Beginning tablet copy session from remote peer at address 127.10.62.129:43217
I20250626 01:57:46.688053 10695 tablet_copy_service.cc:140] P 2bd0037e52244b4dac3fbc01f4e904d1: Received BeginTabletCopySession request for tablet f962b93a2dc243e6a30d87e43ed96ada from peer 9a011f4230fc452abab60d8c23230642 ({username='slave'} at 127.10.62.130:54599)
I20250626 01:57:46.688799 10695 tablet_copy_service.cc:161] P 2bd0037e52244b4dac3fbc01f4e904d1: Beginning new tablet copy session on tablet f962b93a2dc243e6a30d87e43ed96ada from peer 9a011f4230fc452abab60d8c23230642 at {username='slave'} at 127.10.62.130:54599: session id = 9a011f4230fc452abab60d8c23230642-f962b93a2dc243e6a30d87e43ed96ada
I20250626 01:57:46.699483 10695 tablet_copy_source_session.cc:215] T f962b93a2dc243e6a30d87e43ed96ada P 2bd0037e52244b4dac3fbc01f4e904d1: Tablet Copy: opened 0 blocks and 1 log segments
I20250626 01:57:46.703213 11146 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet f962b93a2dc243e6a30d87e43ed96ada. 1 dirs total, 0 dirs full, 0 dirs failed
I20250626 01:57:46.715143 11146 tablet_copy_client.cc:806] T f962b93a2dc243e6a30d87e43ed96ada P 9a011f4230fc452abab60d8c23230642: tablet copy: Starting download of 0 data blocks...
I20250626 01:57:46.715924 11146 tablet_copy_client.cc:670] T f962b93a2dc243e6a30d87e43ed96ada P 9a011f4230fc452abab60d8c23230642: tablet copy: Starting download of 1 WAL segments...
I20250626 01:57:46.737042 11146 tablet_copy_client.cc:538] T f962b93a2dc243e6a30d87e43ed96ada P 9a011f4230fc452abab60d8c23230642: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20250626 01:57:46.745004 11146 tablet_bootstrap.cc:492] T f962b93a2dc243e6a30d87e43ed96ada P 9a011f4230fc452abab60d8c23230642: Bootstrap starting.
I20250626 01:57:48.055904 11146 tablet_bootstrap.cc:492] T f962b93a2dc243e6a30d87e43ed96ada P 9a011f4230fc452abab60d8c23230642: Bootstrap replayed 1/1 log segments. Stats: ops{read=218 overwritten=0 applied=218 ignored=0} inserts{seen=2747 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250626 01:57:48.056861 11146 tablet_bootstrap.cc:492] T f962b93a2dc243e6a30d87e43ed96ada P 9a011f4230fc452abab60d8c23230642: Bootstrap complete.
I20250626 01:57:48.057509 11146 ts_tablet_manager.cc:1397] T f962b93a2dc243e6a30d87e43ed96ada P 9a011f4230fc452abab60d8c23230642: Time spent bootstrapping tablet: real 1.313s	user 1.274s	sys 0.036s
I20250626 01:57:48.059687 11146 raft_consensus.cc:357] T f962b93a2dc243e6a30d87e43ed96ada P 9a011f4230fc452abab60d8c23230642 [term 1 NON_PARTICIPANT]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 43217 } }
I20250626 01:57:48.060281 11146 raft_consensus.cc:738] T f962b93a2dc243e6a30d87e43ed96ada P 9a011f4230fc452abab60d8c23230642 [term 1 NON_PARTICIPANT]: Becoming Follower/Learner. State: Replica: 9a011f4230fc452abab60d8c23230642, State: Initialized, Role: NON_PARTICIPANT
I20250626 01:57:48.060796 11146 consensus_queue.cc:260] T f962b93a2dc243e6a30d87e43ed96ada P 9a011f4230fc452abab60d8c23230642 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 218, Last appended: 1.218, Last appended by leader: 218, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 43217 } }
I20250626 01:57:48.064078 11146 ts_tablet_manager.cc:1428] T f962b93a2dc243e6a30d87e43ed96ada P 9a011f4230fc452abab60d8c23230642: Time spent starting tablet: real 0.006s	user 0.008s	sys 0.000s
I20250626 01:57:48.066680 10695 tablet_copy_service.cc:342] P 2bd0037e52244b4dac3fbc01f4e904d1: Request end of tablet copy session 9a011f4230fc452abab60d8c23230642-f962b93a2dc243e6a30d87e43ed96ada received from {username='slave'} at 127.10.62.130:54599
I20250626 01:57:48.067487 10695 tablet_copy_service.cc:434] P 2bd0037e52244b4dac3fbc01f4e904d1: ending tablet copy session 9a011f4230fc452abab60d8c23230642-f962b93a2dc243e6a30d87e43ed96ada on tablet f962b93a2dc243e6a30d87e43ed96ada with peer 9a011f4230fc452abab60d8c23230642
W20250626 01:57:48.073048 11146 ts_tablet_manager.cc:726] T f962b93a2dc243e6a30d87e43ed96ada P 9a011f4230fc452abab60d8c23230642: Tablet Copy: Invalid argument: Leader has replica of tablet f962b93a2dc243e6a30d87e43ed96ada with term 0, which is lower than last-logged term 1 on local replica. Rejecting tablet copy request
W20250626 01:57:48.081287 11146 ts_tablet_manager.cc:726] T 53317ba56bb94773ba0bd55bf74169f5 P 9a011f4230fc452abab60d8c23230642: Tablet Copy: Invalid argument: Leader has replica of tablet 53317ba56bb94773ba0bd55bf74169f5 with term 0, which is lower than last-logged term 1 on local replica. Rejecting tablet copy request
I20250626 01:57:48.086607 11146 ts_tablet_manager.cc:927] T 96f7f13eaa024c4c9a537775a646567d P 9a011f4230fc452abab60d8c23230642: Initiating tablet copy from peer 2bd0037e52244b4dac3fbc01f4e904d1 (127.10.62.129:43217)
I20250626 01:57:48.088140 11146 tablet_copy_client.cc:323] T 96f7f13eaa024c4c9a537775a646567d P 9a011f4230fc452abab60d8c23230642: tablet copy: Beginning tablet copy session from remote peer at address 127.10.62.129:43217
I20250626 01:57:48.090059 10695 tablet_copy_service.cc:140] P 2bd0037e52244b4dac3fbc01f4e904d1: Received BeginTabletCopySession request for tablet 96f7f13eaa024c4c9a537775a646567d from peer 9a011f4230fc452abab60d8c23230642 ({username='slave'} at 127.10.62.130:54599)
I20250626 01:57:48.090615 10695 tablet_copy_service.cc:161] P 2bd0037e52244b4dac3fbc01f4e904d1: Beginning new tablet copy session on tablet 96f7f13eaa024c4c9a537775a646567d from peer 9a011f4230fc452abab60d8c23230642 at {username='slave'} at 127.10.62.130:54599: session id = 9a011f4230fc452abab60d8c23230642-96f7f13eaa024c4c9a537775a646567d
I20250626 01:57:48.098654 10695 tablet_copy_source_session.cc:215] T 96f7f13eaa024c4c9a537775a646567d P 2bd0037e52244b4dac3fbc01f4e904d1: Tablet Copy: opened 0 blocks and 1 log segments
I20250626 01:57:48.102003 11146 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 96f7f13eaa024c4c9a537775a646567d. 1 dirs total, 0 dirs full, 0 dirs failed
I20250626 01:57:48.112813 11146 tablet_copy_client.cc:806] T 96f7f13eaa024c4c9a537775a646567d P 9a011f4230fc452abab60d8c23230642: tablet copy: Starting download of 0 data blocks...
I20250626 01:57:48.113458 11146 tablet_copy_client.cc:670] T 96f7f13eaa024c4c9a537775a646567d P 9a011f4230fc452abab60d8c23230642: tablet copy: Starting download of 1 WAL segments...
I20250626 01:57:48.128650 11146 tablet_copy_client.cc:538] T 96f7f13eaa024c4c9a537775a646567d P 9a011f4230fc452abab60d8c23230642: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20250626 01:57:48.136515 11146 tablet_bootstrap.cc:492] T 96f7f13eaa024c4c9a537775a646567d P 9a011f4230fc452abab60d8c23230642: Bootstrap starting.
W20250626 01:57:49.403434 11136 debug-util.cc:398] Leaking SignalData structure 0x7b08000ac0a0 after lost signal to thread 11011
W20250626 01:57:49.404686 11136 debug-util.cc:398] Leaking SignalData structure 0x7b08000ccac0 after lost signal to thread 11139
I20250626 01:57:49.550804 11146 tablet_bootstrap.cc:492] T 96f7f13eaa024c4c9a537775a646567d P 9a011f4230fc452abab60d8c23230642: Bootstrap replayed 1/1 log segments. Stats: ops{read=218 overwritten=0 applied=218 ignored=0} inserts{seen=2772 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250626 01:57:49.551966 11146 tablet_bootstrap.cc:492] T 96f7f13eaa024c4c9a537775a646567d P 9a011f4230fc452abab60d8c23230642: Bootstrap complete.
I20250626 01:57:49.552785 11146 ts_tablet_manager.cc:1397] T 96f7f13eaa024c4c9a537775a646567d P 9a011f4230fc452abab60d8c23230642: Time spent bootstrapping tablet: real 1.416s	user 1.381s	sys 0.024s
I20250626 01:57:49.556066 11146 raft_consensus.cc:357] T 96f7f13eaa024c4c9a537775a646567d P 9a011f4230fc452abab60d8c23230642 [term 1 NON_PARTICIPANT]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 43217 } }
I20250626 01:57:49.556732 11146 raft_consensus.cc:738] T 96f7f13eaa024c4c9a537775a646567d P 9a011f4230fc452abab60d8c23230642 [term 1 NON_PARTICIPANT]: Becoming Follower/Learner. State: Replica: 9a011f4230fc452abab60d8c23230642, State: Initialized, Role: NON_PARTICIPANT
I20250626 01:57:49.557379 11146 consensus_queue.cc:260] T 96f7f13eaa024c4c9a537775a646567d P 9a011f4230fc452abab60d8c23230642 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 218, Last appended: 1.218, Last appended by leader: 218, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "2bd0037e52244b4dac3fbc01f4e904d1" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 43217 } }
I20250626 01:57:49.561156 11146 ts_tablet_manager.cc:1428] T 96f7f13eaa024c4c9a537775a646567d P 9a011f4230fc452abab60d8c23230642: Time spent starting tablet: real 0.008s	user 0.005s	sys 0.002s
I20250626 01:57:49.564321 10695 tablet_copy_service.cc:342] P 2bd0037e52244b4dac3fbc01f4e904d1: Request end of tablet copy session 9a011f4230fc452abab60d8c23230642-96f7f13eaa024c4c9a537775a646567d received from {username='slave'} at 127.10.62.130:54599
I20250626 01:57:49.564985 10695 tablet_copy_service.cc:434] P 2bd0037e52244b4dac3fbc01f4e904d1: ending tablet copy session 9a011f4230fc452abab60d8c23230642-96f7f13eaa024c4c9a537775a646567d on tablet 96f7f13eaa024c4c9a537775a646567d with peer 9a011f4230fc452abab60d8c23230642
W20250626 01:57:49.570117 11146 ts_tablet_manager.cc:726] T 96f7f13eaa024c4c9a537775a646567d P 9a011f4230fc452abab60d8c23230642: Tablet Copy: Invalid argument: Leader has replica of tablet 96f7f13eaa024c4c9a537775a646567d with term 0, which is lower than last-logged term 1 on local replica. Rejecting tablet copy request
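The "Rejecting tablet copy request" warnings above (ts_tablet_manager.cc:726) all trip the same guard: the requesting leader reports term 0 for the tablet, which is lower than the term 1 already recorded in the local replica's WAL. A minimal standalone sketch of that comparison, purely illustrative and not Kudu's actual implementation:

#include <cstdint>
#include <iostream>
#include <string>

// Hypothetical guard mirroring the warning text: a copy request coming from a
// leader whose term is lower than the local replica's last-logged term is
// refused, so an older copy of the tablet can never replace newer local state.
bool ShouldRejectTabletCopy(int64_t leader_term, int64_t last_logged_term,
                            std::string* reason) {
  if (leader_term < last_logged_term) {
    *reason = "Leader has replica with term " + std::to_string(leader_term) +
              ", which is lower than last-logged term " +
              std::to_string(last_logged_term) + " on local replica";
    return true;
  }
  return false;
}

int main() {
  std::string reason;
  // The values seen above: the requesting leader reports term 0, while the
  // local replica has already logged term 1.
  if (ShouldRejectTabletCopy(0, 1, &reason)) {
    std::cout << "Rejecting tablet copy request: " << reason << "\n";
  }
  return 0;
}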
I20250626 01:57:49.575383 10490 tablet_copy-itest.cc:1252] Number of Service unavailable responses: 1226
I20250626 01:57:49.575881 10490 tablet_copy-itest.cc:1253] Number of in progress responses: 916
I20250626 01:57:49.582574 10490 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu with pid 10591
I20250626 01:57:49.644861 10490 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu with pid 11010
I20250626 01:57:49.687438 10490 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu with pid 10864
2025-06-26T01:57:49Z chronyd exiting
[       OK ] TabletCopyITest.TestTabletCopyThrottling (24804 ms)
[ RUN      ] TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate
2025-06-26T01:57:49Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2025-06-26T01:57:49Z Disabled control of system clock
I20250626 01:57:49.818058 10490 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu
/tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.10.62.190:42257
--webserver_interface=127.10.62.190
--webserver_port=0
--builtin_ntp_servers=127.10.62.148:33005
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.10.62.190:42257 with env {}
W20250626 01:57:50.199991 11165 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250626 01:57:50.200816 11165 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250626 01:57:50.201390 11165 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250626 01:57:50.237727 11165 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20250626 01:57:50.238132 11165 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250626 01:57:50.238463 11165 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20250626 01:57:50.238741 11165 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20250626 01:57:50.281246 11165 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.10.62.148:33005
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.10.62.190:42257
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.10.62.190:42257
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.10.62.190
--webserver_port=0
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.18.0-SNAPSHOT
revision f7c956859e2f49c4cf1caffa969c1777a7a5d81c
build type FASTDEBUG
built by None at 26 Jun 2025 01:43:20 UTC on 5fd53c4cbb9d
build id 6773
TSAN enabled
I20250626 01:57:50.283170 11165 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250626 01:57:50.285524 11165 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250626 01:57:50.304791 11171 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250626 01:57:50.305563 11172 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250626 01:57:50.306926 11174 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250626 01:57:51.534780 11173 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Connection time-out
I20250626 01:57:51.534855 11165 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
I20250626 01:57:51.539911 11165 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250626 01:57:51.543639 11165 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250626 01:57:51.545182 11165 hybrid_clock.cc:648] HybridClock initialized: now 1750903071545126 us; error 86 us; skew 500 ppm
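The HybridClock line above reports an initial error of 86 us and an assumed maximum skew of 500 ppm; between synchronizations the error bound can only grow at that skew rate. A rough reading of those numbers (illustrative arithmetic only, not Kudu's HybridClock code):

#include <cstdint>
#include <iostream>

// Illustrative arithmetic only: a skew of S ppm means up to S microseconds of
// extra drift per second of elapsed time, added on top of the initial error.
int64_t MaxErrorAfterUs(int64_t initial_error_us, int64_t skew_ppm,
                        int64_t elapsed_us) {
  return initial_error_us + (elapsed_us * skew_ppm) / 1000000;
}

int main() {
  // Numbers from the line above: error 86 us, skew 500 ppm. One second after
  // the last synchronization the bound would be 86 + 500 = 586 us.
  std::cout << MaxErrorAfterUs(86, 500, 1000000) << " us\n";
  return 0;
}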
I20250626 01:57:51.546298 11165 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250626 01:57:51.555217 11165 webserver.cc:469] Webserver started at http://127.10.62.190:41677/ using document root <none> and password file <none>
I20250626 01:57:51.556849 11165 fs_manager.cc:362] Metadata directory not provided
I20250626 01:57:51.557385 11165 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250626 01:57:51.558210 11165 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250626 01:57:51.564822 11165 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/master-0/data/instance:
uuid: "b25b267a56dc454997f429334e96abf3"
format_stamp: "Formatted at 2025-06-26 01:57:51 on dist-test-slave-k90p"
I20250626 01:57:51.566520 11165 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/master-0/wal/instance:
uuid: "b25b267a56dc454997f429334e96abf3"
format_stamp: "Formatted at 2025-06-26 01:57:51 on dist-test-slave-k90p"
I20250626 01:57:51.577477 11165 fs_manager.cc:696] Time spent creating directory manager: real 0.010s	user 0.007s	sys 0.001s
I20250626 01:57:51.585536 11181 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20250626 01:57:51.587188 11165 fs_manager.cc:730] Time spent opening block manager: real 0.005s	user 0.002s	sys 0.004s
I20250626 01:57:51.587874 11165 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/master-0/data,/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/master-0/wal
uuid: "b25b267a56dc454997f429334e96abf3"
format_stamp: "Formatted at 2025-06-26 01:57:51 on dist-test-slave-k90p"
I20250626 01:57:51.588346 11165 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250626 01:57:51.646235 11165 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20250626 01:57:51.648304 11165 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250626 01:57:51.648927 11165 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250626 01:57:51.746733 11165 rpc_server.cc:307] RPC server started. Bound to: 127.10.62.190:42257
I20250626 01:57:51.746838 11232 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.10.62.190:42257 every 8 connection(s)
I20250626 01:57:51.750381 11165 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/master-0/data/info.pb
I20250626 01:57:51.752684 10490 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu as pid 11165
I20250626 01:57:51.753263 10490 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/master-0/wal/instance
I20250626 01:57:51.759450 11233 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20250626 01:57:51.790719 11233 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P b25b267a56dc454997f429334e96abf3: Bootstrap starting.
I20250626 01:57:51.799291 11233 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P b25b267a56dc454997f429334e96abf3: Neither blocks nor log segments found. Creating new log.
I20250626 01:57:51.801698 11233 log.cc:826] T 00000000000000000000000000000000 P b25b267a56dc454997f429334e96abf3: Log is configured to *not* fsync() on all Append() calls
I20250626 01:57:51.808769 11233 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P b25b267a56dc454997f429334e96abf3: No bootstrap required, opened a new log
I20250626 01:57:51.831431 11233 raft_consensus.cc:357] T 00000000000000000000000000000000 P b25b267a56dc454997f429334e96abf3 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "b25b267a56dc454997f429334e96abf3" member_type: VOTER last_known_addr { host: "127.10.62.190" port: 42257 } }
I20250626 01:57:51.832404 11233 raft_consensus.cc:383] T 00000000000000000000000000000000 P b25b267a56dc454997f429334e96abf3 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250626 01:57:51.832731 11233 raft_consensus.cc:738] T 00000000000000000000000000000000 P b25b267a56dc454997f429334e96abf3 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: b25b267a56dc454997f429334e96abf3, State: Initialized, Role: FOLLOWER
I20250626 01:57:51.833673 11233 consensus_queue.cc:260] T 00000000000000000000000000000000 P b25b267a56dc454997f429334e96abf3 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "b25b267a56dc454997f429334e96abf3" member_type: VOTER last_known_addr { host: "127.10.62.190" port: 42257 } }
I20250626 01:57:51.834357 11233 raft_consensus.cc:397] T 00000000000000000000000000000000 P b25b267a56dc454997f429334e96abf3 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250626 01:57:51.834710 11233 raft_consensus.cc:491] T 00000000000000000000000000000000 P b25b267a56dc454997f429334e96abf3 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250626 01:57:51.835261 11233 raft_consensus.cc:3058] T 00000000000000000000000000000000 P b25b267a56dc454997f429334e96abf3 [term 0 FOLLOWER]: Advancing to term 1
I20250626 01:57:51.841964 11233 raft_consensus.cc:513] T 00000000000000000000000000000000 P b25b267a56dc454997f429334e96abf3 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "b25b267a56dc454997f429334e96abf3" member_type: VOTER last_known_addr { host: "127.10.62.190" port: 42257 } }
I20250626 01:57:51.843062 11233 leader_election.cc:304] T 00000000000000000000000000000000 P b25b267a56dc454997f429334e96abf3 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: b25b267a56dc454997f429334e96abf3; no voters: 
I20250626 01:57:51.845392 11233 leader_election.cc:290] T 00000000000000000000000000000000 P b25b267a56dc454997f429334e96abf3 [CANDIDATE]: Term 1 election: Requested vote from peers 
I20250626 01:57:51.846292 11238 raft_consensus.cc:2802] T 00000000000000000000000000000000 P b25b267a56dc454997f429334e96abf3 [term 1 FOLLOWER]: Leader election won for term 1
I20250626 01:57:51.849292 11238 raft_consensus.cc:695] T 00000000000000000000000000000000 P b25b267a56dc454997f429334e96abf3 [term 1 LEADER]: Becoming Leader. State: Replica: b25b267a56dc454997f429334e96abf3, State: Running, Role: LEADER
I20250626 01:57:51.850275 11238 consensus_queue.cc:237] T 00000000000000000000000000000000 P b25b267a56dc454997f429334e96abf3 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "b25b267a56dc454997f429334e96abf3" member_type: VOTER last_known_addr { host: "127.10.62.190" port: 42257 } }
I20250626 01:57:51.851352 11233 sys_catalog.cc:564] T 00000000000000000000000000000000 P b25b267a56dc454997f429334e96abf3 [sys.catalog]: configured and running, proceeding with master startup.
I20250626 01:57:51.865283 11240 sys_catalog.cc:455] T 00000000000000000000000000000000 P b25b267a56dc454997f429334e96abf3 [sys.catalog]: SysCatalogTable state changed. Reason: New leader b25b267a56dc454997f429334e96abf3. Latest consensus state: current_term: 1 leader_uuid: "b25b267a56dc454997f429334e96abf3" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "b25b267a56dc454997f429334e96abf3" member_type: VOTER last_known_addr { host: "127.10.62.190" port: 42257 } } }
I20250626 01:57:51.865945 11239 sys_catalog.cc:455] T 00000000000000000000000000000000 P b25b267a56dc454997f429334e96abf3 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "b25b267a56dc454997f429334e96abf3" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "b25b267a56dc454997f429334e96abf3" member_type: VOTER last_known_addr { host: "127.10.62.190" port: 42257 } } }
I20250626 01:57:51.866362 11240 sys_catalog.cc:458] T 00000000000000000000000000000000 P b25b267a56dc454997f429334e96abf3 [sys.catalog]: This master's current role is: LEADER
I20250626 01:57:51.866613 11239 sys_catalog.cc:458] T 00000000000000000000000000000000 P b25b267a56dc454997f429334e96abf3 [sys.catalog]: This master's current role is: LEADER
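The election above is decided in a single step because the sys catalog config contains exactly one voter: "Only one voter in the Raft config. Triggering election immediately", with a tally of 1 yes vote out of 1. A small sketch of the majority arithmetic behind that outcome (illustrative, not leader_election.cc itself):

#include <iostream>

// Majority arithmetic for the single-replica election above.
int MajoritySize(int num_voters) {
  return num_voters / 2 + 1;
}

int main() {
  const int voters = 1;     // the config above lists a single VOTER peer
  const int yes_votes = 1;  // the candidate's own vote
  std::cout << "majority size: " << MajoritySize(voters) << "\n";
  std::cout << (yes_votes >= MajoritySize(voters) ? "candidate won" : "undecided")
            << "\n";
  return 0;
}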
I20250626 01:57:51.871493 11247 catalog_manager.cc:1477] Loading table and tablet metadata into memory...
I20250626 01:57:51.888280 11247 catalog_manager.cc:1486] Initializing Kudu cluster ID...
I20250626 01:57:51.910712 11247 catalog_manager.cc:1349] Generated new cluster ID: 78f0534e97ca46249636e91ffd687abf
I20250626 01:57:51.911062 11247 catalog_manager.cc:1497] Initializing Kudu internal certificate authority...
I20250626 01:57:51.944247 11247 catalog_manager.cc:1372] Generated new certificate authority record
I20250626 01:57:51.946259 11247 catalog_manager.cc:1506] Loading token signing keys...
I20250626 01:57:51.968791 11247 catalog_manager.cc:5955] T 00000000000000000000000000000000 P b25b267a56dc454997f429334e96abf3: Generated new TSK 0
I20250626 01:57:51.970333 11247 catalog_manager.cc:1516] Initializing in-progress tserver states...
I20250626 01:57:51.997867 10490 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu
/tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.10.62.129:0
--local_ip_for_outbound_sockets=127.10.62.129
--webserver_interface=127.10.62.129
--webserver_port=0
--tserver_master_addrs=127.10.62.190:42257
--builtin_ntp_servers=127.10.62.148:33005
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--enable_flush_memrowset=false
--enable_flush_deltamemstores=false
--tablet_copy_download_threads_nums_per_session=4
--log_segment_size_mb=1 with env {}
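The tablet server above is started with a handful of test-specific non-default flags: --log_segment_size_mb=1, --enable_flush_memrowset=false, --enable_flush_deltamemstores=false and --tablet_copy_download_threads_nums_per_session=4, which together keep many small WAL segments alive and download them with four threads per copy session. Below is a minimal sketch of how an integration test would typically pass such flags through the external mini cluster harness; the class, field and header names are recalled from Kudu's test utilities and should be treated as assumptions rather than the actual TestDownloadWalInParallelWithHeavyUpdate code.

#include <memory>
#include <string>
#include <utility>
#include <vector>

#include <gtest/gtest.h>

#include "kudu/mini-cluster/external_mini_cluster.h"
#include "kudu/util/test_macros.h"

using kudu::cluster::ExternalMiniCluster;
using kudu::cluster::ExternalMiniClusterOptions;

// Sketch only: start a cluster whose tablet servers get the non-default flags
// logged above. Field and header names are assumptions, not verified code.
void StartClusterForWalDownloadTest(
    std::unique_ptr<ExternalMiniCluster>* cluster) {
  ExternalMiniClusterOptions opts;
  opts.num_tablet_servers = 3;  // the log above starts ts-0, ts-1 and ts-2
  opts.extra_tserver_flags = {
      "--log_segment_size_mb=1",
      "--enable_flush_memrowset=false",
      "--enable_flush_deltamemstores=false",
      "--tablet_copy_download_threads_nums_per_session=4",
  };
  cluster->reset(new ExternalMiniCluster(std::move(opts)));
  ASSERT_OK((*cluster)->Start());
}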
W20250626 01:57:52.393025 11257 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250626 01:57:52.393729 11257 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250626 01:57:52.394237 11257 flags.cc:425] Enabled unsafe flag: --enable_flush_deltamemstores=false
W20250626 01:57:52.394469 11257 flags.cc:425] Enabled unsafe flag: --enable_flush_memrowset=false
W20250626 01:57:52.394868 11257 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250626 01:57:52.432778 11257 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250626 01:57:52.433948 11257 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.10.62.129
I20250626 01:57:52.476671 11257 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.10.62.148:33005
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_segment_size_mb=1
--fs_data_dirs=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.10.62.129:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.10.62.129
--webserver_port=0
--enable_flush_deltamemstores=false
--enable_flush_memrowset=false
--tserver_master_addrs=127.10.62.190:42257
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.10.62.129
--log_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.18.0-SNAPSHOT
revision f7c956859e2f49c4cf1caffa969c1777a7a5d81c
build type FASTDEBUG
built by None at 26 Jun 2025 01:43:20 UTC on 5fd53c4cbb9d
build id 6773
TSAN enabled
I20250626 01:57:52.478494 11257 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250626 01:57:52.481168 11257 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250626 01:57:52.504405 11264 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250626 01:57:52.508392 11263 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250626 01:57:52.508823 11266 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250626 01:57:52.509586 11257 server_base.cc:1048] running on GCE node
I20250626 01:57:53.792589 11257 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250626 01:57:53.796376 11257 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250626 01:57:53.798030 11257 hybrid_clock.cc:648] HybridClock initialized: now 1750903073797962 us; error 107 us; skew 500 ppm
I20250626 01:57:53.799042 11257 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250626 01:57:53.813577 11257 webserver.cc:469] Webserver started at http://127.10.62.129:40813/ using document root <none> and password file <none>
I20250626 01:57:53.814847 11257 fs_manager.cc:362] Metadata directory not provided
I20250626 01:57:53.815168 11257 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250626 01:57:53.815769 11257 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250626 01:57:53.821494 11257 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-0/data/instance:
uuid: "37c2019541f54a949ff92c6e59946308"
format_stamp: "Formatted at 2025-06-26 01:57:53 on dist-test-slave-k90p"
I20250626 01:57:53.823213 11257 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-0/wal/instance:
uuid: "37c2019541f54a949ff92c6e59946308"
format_stamp: "Formatted at 2025-06-26 01:57:53 on dist-test-slave-k90p"
I20250626 01:57:53.833166 11257 fs_manager.cc:696] Time spent creating directory manager: real 0.009s	user 0.008s	sys 0.000s
I20250626 01:57:53.841171 11273 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20250626 01:57:53.842792 11257 fs_manager.cc:730] Time spent opening block manager: real 0.006s	user 0.001s	sys 0.005s
I20250626 01:57:53.843358 11257 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-0/data,/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-0/wal
uuid: "37c2019541f54a949ff92c6e59946308"
format_stamp: "Formatted at 2025-06-26 01:57:53 on dist-test-slave-k90p"
I20250626 01:57:53.843858 11257 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250626 01:57:53.916975 11257 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20250626 01:57:53.918823 11257 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250626 01:57:53.919422 11257 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250626 01:57:53.923197 11257 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250626 01:57:53.928681 11257 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250626 01:57:53.929023 11257 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20250626 01:57:53.929316 11257 ts_tablet_manager.cc:610] Registered 0 tablets
I20250626 01:57:53.929498 11257 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20250626 01:57:54.108667 11257 rpc_server.cc:307] RPC server started. Bound to: 127.10.62.129:41009
I20250626 01:57:54.108911 11385 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.10.62.129:41009 every 8 connection(s)
I20250626 01:57:54.112042 11257 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-0/data/info.pb
I20250626 01:57:54.121331 10490 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu as pid 11257
I20250626 01:57:54.121803 10490 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-0/wal/instance
I20250626 01:57:54.130554 10490 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu
/tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.10.62.130:0
--local_ip_for_outbound_sockets=127.10.62.130
--webserver_interface=127.10.62.130
--webserver_port=0
--tserver_master_addrs=127.10.62.190:42257
--builtin_ntp_servers=127.10.62.148:33005
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--enable_flush_memrowset=false
--enable_flush_deltamemstores=false
--tablet_copy_download_threads_nums_per_session=4
--log_segment_size_mb=1 with env {}
I20250626 01:57:54.142467 11386 heartbeater.cc:344] Connected to a master server at 127.10.62.190:42257
I20250626 01:57:54.143256 11386 heartbeater.cc:461] Registering TS with master...
I20250626 01:57:54.145021 11386 heartbeater.cc:507] Master 127.10.62.190:42257 requested a full tablet report, sending...
I20250626 01:57:54.149196 11198 ts_manager.cc:194] Registered new tserver with Master: 37c2019541f54a949ff92c6e59946308 (127.10.62.129:41009)
I20250626 01:57:54.151765 11198 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.10.62.129:48729
W20250626 01:57:54.439415 11229 debug-util.cc:398] Leaking SignalData structure 0x7b080006f1c0 after lost signal to thread 11166
W20250626 01:57:54.494185 11390 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250626 01:57:54.494784 11390 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250626 01:57:54.495337 11390 flags.cc:425] Enabled unsafe flag: --enable_flush_deltamemstores=false
W20250626 01:57:54.495558 11390 flags.cc:425] Enabled unsafe flag: --enable_flush_memrowset=false
W20250626 01:57:54.496137 11390 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250626 01:57:54.532478 11390 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250626 01:57:54.533608 11390 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.10.62.130
I20250626 01:57:54.576323 11390 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.10.62.148:33005
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_segment_size_mb=1
--fs_data_dirs=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.10.62.130:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.10.62.130
--webserver_port=0
--enable_flush_deltamemstores=false
--enable_flush_memrowset=false
--tserver_master_addrs=127.10.62.190:42257
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.10.62.130
--log_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.18.0-SNAPSHOT
revision f7c956859e2f49c4cf1caffa969c1777a7a5d81c
build type FASTDEBUG
built by None at 26 Jun 2025 01:43:20 UTC on 5fd53c4cbb9d
build id 6773
TSAN enabled
I20250626 01:57:54.578075 11390 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250626 01:57:54.580281 11390 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250626 01:57:54.603843 11397 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250626 01:57:54.604245 11396 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250626 01:57:54.605482 11399 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250626 01:57:55.157732 11386 heartbeater.cc:499] Master 127.10.62.190:42257 was elected leader, sending a full tablet report...
W20250626 01:57:56.382701 11398 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Resolving timed out after 1777 milliseconds
I20250626 01:57:56.382891 11390 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
I20250626 01:57:56.384658 11390 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250626 01:57:56.389346 11390 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250626 01:57:56.390945 11390 hybrid_clock.cc:648] HybridClock initialized: now 1750903076390891 us; error 82 us; skew 500 ppm
I20250626 01:57:56.392030 11390 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250626 01:57:56.401707 11390 webserver.cc:469] Webserver started at http://127.10.62.130:41223/ using document root <none> and password file <none>
I20250626 01:57:56.403019 11390 fs_manager.cc:362] Metadata directory not provided
I20250626 01:57:56.403391 11390 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250626 01:57:56.404016 11390 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250626 01:57:56.410006 11390 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-1/data/instance:
uuid: "f990919f780a49e39097b5e2cd933557"
format_stamp: "Formatted at 2025-06-26 01:57:56 on dist-test-slave-k90p"
I20250626 01:57:56.411650 11390 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-1/wal/instance:
uuid: "f990919f780a49e39097b5e2cd933557"
format_stamp: "Formatted at 2025-06-26 01:57:56 on dist-test-slave-k90p"
I20250626 01:57:56.422005 11390 fs_manager.cc:696] Time spent creating directory manager: real 0.010s	user 0.006s	sys 0.004s
I20250626 01:57:56.431792 11407 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20250626 01:57:56.433359 11390 fs_manager.cc:730] Time spent opening block manager: real 0.006s	user 0.004s	sys 0.000s
I20250626 01:57:56.433817 11390 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-1/data,/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-1/wal
uuid: "f990919f780a49e39097b5e2cd933557"
format_stamp: "Formatted at 2025-06-26 01:57:56 on dist-test-slave-k90p"
I20250626 01:57:56.434237 11390 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250626 01:57:56.500624 11390 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20250626 01:57:56.502732 11390 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250626 01:57:56.503410 11390 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250626 01:57:56.507416 11390 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250626 01:57:56.512902 11390 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250626 01:57:56.513182 11390 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20250626 01:57:56.513522 11390 ts_tablet_manager.cc:610] Registered 0 tablets
I20250626 01:57:56.513698 11390 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20250626 01:57:56.714920 11390 rpc_server.cc:307] RPC server started. Bound to: 127.10.62.130:37021
I20250626 01:57:56.715144 11519 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.10.62.130:37021 every 8 connection(s)
I20250626 01:57:56.718194 11390 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-1/data/info.pb
I20250626 01:57:56.721568 10490 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu as pid 11390
I20250626 01:57:56.722532 10490 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-1/wal/instance
I20250626 01:57:56.733515 10490 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu
/tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.10.62.131:0
--local_ip_for_outbound_sockets=127.10.62.131
--webserver_interface=127.10.62.131
--webserver_port=0
--tserver_master_addrs=127.10.62.190:42257
--builtin_ntp_servers=127.10.62.148:33005
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--enable_flush_memrowset=false
--enable_flush_deltamemstores=false
--tablet_copy_download_threads_nums_per_session=4
--log_segment_size_mb=1 with env {}
I20250626 01:57:56.752558 11520 heartbeater.cc:344] Connected to a master server at 127.10.62.190:42257
I20250626 01:57:56.753196 11520 heartbeater.cc:461] Registering TS with master...
I20250626 01:57:56.754619 11520 heartbeater.cc:507] Master 127.10.62.190:42257 requested a full tablet report, sending...
I20250626 01:57:56.757875 11198 ts_manager.cc:194] Registered new tserver with Master: f990919f780a49e39097b5e2cd933557 (127.10.62.130:37021)
I20250626 01:57:56.759565 11198 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.10.62.130:42195
W20250626 01:57:57.104836 11524 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250626 01:57:57.105563 11524 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250626 01:57:57.106086 11524 flags.cc:425] Enabled unsafe flag: --enable_flush_deltamemstores=false
W20250626 01:57:57.106352 11524 flags.cc:425] Enabled unsafe flag: --enable_flush_memrowset=false
W20250626 01:57:57.106748 11524 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250626 01:57:57.143575 11524 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250626 01:57:57.144750 11524 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.10.62.131
I20250626 01:57:57.186316 11524 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.10.62.148:33005
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_segment_size_mb=1
--fs_data_dirs=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.10.62.131:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.10.62.131
--webserver_port=0
--enable_flush_deltamemstores=false
--enable_flush_memrowset=false
--tserver_master_addrs=127.10.62.190:42257
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.10.62.131
--log_dir=/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.18.0-SNAPSHOT
revision f7c956859e2f49c4cf1caffa969c1777a7a5d81c
build type FASTDEBUG
built by None at 26 Jun 2025 01:43:20 UTC on 5fd53c4cbb9d
build id 6773
TSAN enabled
I20250626 01:57:57.188251 11524 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250626 01:57:57.190574 11524 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250626 01:57:57.210894 11531 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250626 01:57:57.212459 11530 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250626 01:57:57.215286 11533 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250626 01:57:57.764106 11520 heartbeater.cc:499] Master 127.10.62.190:42257 was elected leader, sending a full tablet report...
W20250626 01:57:58.437640 11532 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Connection time-out
I20250626 01:57:58.437825 11524 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
I20250626 01:57:58.442728 11524 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250626 01:57:58.446322 11524 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250626 01:57:58.447932 11524 hybrid_clock.cc:648] HybridClock initialized: now 1750903078447879 us; error 79 us; skew 500 ppm
I20250626 01:57:58.449043 11524 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250626 01:57:58.457528 11524 webserver.cc:469] Webserver started at http://127.10.62.131:45861/ using document root <none> and password file <none>
I20250626 01:57:58.458869 11524 fs_manager.cc:362] Metadata directory not provided
I20250626 01:57:58.459245 11524 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250626 01:57:58.459887 11524 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250626 01:57:58.465739 11524 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-2/data/instance:
uuid: "c1be94bb90e44876a3647fd124cf6adf"
format_stamp: "Formatted at 2025-06-26 01:57:58 on dist-test-slave-k90p"
I20250626 01:57:58.467446 11524 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-2/wal/instance:
uuid: "c1be94bb90e44876a3647fd124cf6adf"
format_stamp: "Formatted at 2025-06-26 01:57:58 on dist-test-slave-k90p"
I20250626 01:57:58.477726 11524 fs_manager.cc:696] Time spent creating directory manager: real 0.009s	user 0.005s	sys 0.004s
I20250626 01:57:58.485850 11540 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20250626 01:57:58.487531 11524 fs_manager.cc:730] Time spent opening block manager: real 0.005s	user 0.006s	sys 0.000s
I20250626 01:57:58.488031 11524 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-2/data,/tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-2/wal
uuid: "c1be94bb90e44876a3647fd124cf6adf"
format_stamp: "Formatted at 2025-06-26 01:57:58 on dist-test-slave-k90p"
I20250626 01:57:58.488474 11524 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250626 01:57:58.563164 11524 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20250626 01:57:58.565191 11524 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250626 01:57:58.565851 11524 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250626 01:57:58.569303 11524 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250626 01:57:58.574748 11524 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250626 01:57:58.575120 11524 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20250626 01:57:58.575436 11524 ts_tablet_manager.cc:610] Registered 0 tablets
I20250626 01:57:58.575610 11524 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20250626 01:57:58.771311 11524 rpc_server.cc:307] RPC server started. Bound to: 127.10.62.131:41765
I20250626 01:57:58.771482 11652 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.10.62.131:41765 every 8 connection(s)
I20250626 01:57:58.774928 11524 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-2/data/info.pb
I20250626 01:57:58.783725 10490 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu as pid 11524
I20250626 01:57:58.784318 10490 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0/minicluster-data/ts-2/wal/instance
I20250626 01:57:58.805213 11653 heartbeater.cc:344] Connected to a master server at 127.10.62.190:42257
I20250626 01:57:58.805883 11653 heartbeater.cc:461] Registering TS with master...
I20250626 01:57:58.808070 11653 heartbeater.cc:507] Master 127.10.62.190:42257 requested a full tablet report, sending...
I20250626 01:57:58.811653 11198 ts_manager.cc:194] Registered new tserver with Master: c1be94bb90e44876a3647fd124cf6adf (127.10.62.131:41765)
I20250626 01:57:58.813266 11198 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.10.62.131:49051
I20250626 01:57:58.825224 10490 external_mini_cluster.cc:934] 3 TS(s) registered with all masters
I20250626 01:57:58.874264 10490 test_util.cc:276] Using random seed: -468882314
I20250626 01:57:58.928583 11198 catalog_manager.cc:2232] Servicing CreateTable request from {username='slave'} at 127.0.0.1:40968:
name: "test-workload"
schema {
  columns {
    name: "key"
    type: INT32
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "int_val"
    type: INT32
    is_key: false
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "string_val"
    type: STRING
    is_key: false
    is_nullable: true
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
  range_schema {
    columns {
      name: "key"
    }
  }
}
W20250626 01:57:58.931892 11198 catalog_manager.cc:6944] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table test-workload in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20250626 01:57:59.013935 11321 tablet_service.cc:1468] Processing CreateTablet for tablet 50dded91450341c1bd9eeeb411f6be6c (DEFAULT_TABLE table=test-workload [id=424996e981104a2ca40d5f79bc7a4238]), partition=RANGE (key) PARTITION UNBOUNDED
I20250626 01:57:59.017571 11321 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 50dded91450341c1bd9eeeb411f6be6c. 1 dirs total, 0 dirs full, 0 dirs failed
I20250626 01:57:59.017197 11588 tablet_service.cc:1468] Processing CreateTablet for tablet 50dded91450341c1bd9eeeb411f6be6c (DEFAULT_TABLE table=test-workload [id=424996e981104a2ca40d5f79bc7a4238]), partition=RANGE (key) PARTITION UNBOUNDED
I20250626 01:57:59.020181 11588 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 50dded91450341c1bd9eeeb411f6be6c. 1 dirs total, 0 dirs full, 0 dirs failed
I20250626 01:57:59.020181 11455 tablet_service.cc:1468] Processing CreateTablet for tablet 50dded91450341c1bd9eeeb411f6be6c (DEFAULT_TABLE table=test-workload [id=424996e981104a2ca40d5f79bc7a4238]), partition=RANGE (key) PARTITION UNBOUNDED
I20250626 01:57:59.022858 11455 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 50dded91450341c1bd9eeeb411f6be6c. 1 dirs total, 0 dirs full, 0 dirs failed
I20250626 01:57:59.053032 11677 tablet_bootstrap.cc:492] T 50dded91450341c1bd9eeeb411f6be6c P 37c2019541f54a949ff92c6e59946308: Bootstrap starting.
I20250626 01:57:59.058920 11678 tablet_bootstrap.cc:492] T 50dded91450341c1bd9eeeb411f6be6c P c1be94bb90e44876a3647fd124cf6adf: Bootstrap starting.
I20250626 01:57:59.062692 11679 tablet_bootstrap.cc:492] T 50dded91450341c1bd9eeeb411f6be6c P f990919f780a49e39097b5e2cd933557: Bootstrap starting.
I20250626 01:57:59.064754 11677 tablet_bootstrap.cc:654] T 50dded91450341c1bd9eeeb411f6be6c P 37c2019541f54a949ff92c6e59946308: Neither blocks nor log segments found. Creating new log.
I20250626 01:57:59.068058 11677 log.cc:826] T 50dded91450341c1bd9eeeb411f6be6c P 37c2019541f54a949ff92c6e59946308: Log is configured to *not* fsync() on all Append() calls
I20250626 01:57:59.070816 11678 tablet_bootstrap.cc:654] T 50dded91450341c1bd9eeeb411f6be6c P c1be94bb90e44876a3647fd124cf6adf: Neither blocks nor log segments found. Creating new log.
I20250626 01:57:59.073246 11679 tablet_bootstrap.cc:654] T 50dded91450341c1bd9eeeb411f6be6c P f990919f780a49e39097b5e2cd933557: Neither blocks nor log segments found. Creating new log.
I20250626 01:57:59.074359 11678 log.cc:826] T 50dded91450341c1bd9eeeb411f6be6c P c1be94bb90e44876a3647fd124cf6adf: Log is configured to *not* fsync() on all Append() calls
I20250626 01:57:59.075871 11677 tablet_bootstrap.cc:492] T 50dded91450341c1bd9eeeb411f6be6c P 37c2019541f54a949ff92c6e59946308: No bootstrap required, opened a new log
I20250626 01:57:59.076613 11677 ts_tablet_manager.cc:1397] T 50dded91450341c1bd9eeeb411f6be6c P 37c2019541f54a949ff92c6e59946308: Time spent bootstrapping tablet: real 0.024s	user 0.006s	sys 0.016s
I20250626 01:57:59.076676 11679 log.cc:826] T 50dded91450341c1bd9eeeb411f6be6c P f990919f780a49e39097b5e2cd933557: Log is configured to *not* fsync() on all Append() calls
I20250626 01:57:59.084426 11678 tablet_bootstrap.cc:492] T 50dded91450341c1bd9eeeb411f6be6c P c1be94bb90e44876a3647fd124cf6adf: No bootstrap required, opened a new log
I20250626 01:57:59.084510 11679 tablet_bootstrap.cc:492] T 50dded91450341c1bd9eeeb411f6be6c P f990919f780a49e39097b5e2cd933557: No bootstrap required, opened a new log
I20250626 01:57:59.085192 11679 ts_tablet_manager.cc:1397] T 50dded91450341c1bd9eeeb411f6be6c P f990919f780a49e39097b5e2cd933557: Time spent bootstrapping tablet: real 0.023s	user 0.010s	sys 0.011s
I20250626 01:57:59.085192 11678 ts_tablet_manager.cc:1397] T 50dded91450341c1bd9eeeb411f6be6c P c1be94bb90e44876a3647fd124cf6adf: Time spent bootstrapping tablet: real 0.027s	user 0.016s	sys 0.008s
I20250626 01:57:59.100729 11677 raft_consensus.cc:357] T 50dded91450341c1bd9eeeb411f6be6c P 37c2019541f54a949ff92c6e59946308 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "37c2019541f54a949ff92c6e59946308" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 41009 } } peers { permanent_uuid: "c1be94bb90e44876a3647fd124cf6adf" member_type: VOTER last_known_addr { host: "127.10.62.131" port: 41765 } } peers { permanent_uuid: "f990919f780a49e39097b5e2cd933557" member_type: VOTER last_known_addr { host: "127.10.62.130" port: 37021 } }
I20250626 01:57:59.101730 11677 raft_consensus.cc:383] T 50dded91450341c1bd9eeeb411f6be6c P 37c2019541f54a949ff92c6e59946308 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250626 01:57:59.102095 11677 raft_consensus.cc:738] T 50dded91450341c1bd9eeeb411f6be6c P 37c2019541f54a949ff92c6e59946308 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 37c2019541f54a949ff92c6e59946308, State: Initialized, Role: FOLLOWER
I20250626 01:57:59.103204 11677 consensus_queue.cc:260] T 50dded91450341c1bd9eeeb411f6be6c P 37c2019541f54a949ff92c6e59946308 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "37c2019541f54a949ff92c6e59946308" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 41009 } } peers { permanent_uuid: "c1be94bb90e44876a3647fd124cf6adf" member_type: VOTER last_known_addr { host: "127.10.62.131" port: 41765 } } peers { permanent_uuid: "f990919f780a49e39097b5e2cd933557" member_type: VOTER last_known_addr { host: "127.10.62.130" port: 37021 } }
I20250626 01:57:59.113433 11677 ts_tablet_manager.cc:1428] T 50dded91450341c1bd9eeeb411f6be6c P 37c2019541f54a949ff92c6e59946308: Time spent starting tablet: real 0.037s	user 0.018s	sys 0.018s
I20250626 01:57:59.118304 11678 raft_consensus.cc:357] T 50dded91450341c1bd9eeeb411f6be6c P c1be94bb90e44876a3647fd124cf6adf [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "37c2019541f54a949ff92c6e59946308" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 41009 } } peers { permanent_uuid: "c1be94bb90e44876a3647fd124cf6adf" member_type: VOTER last_known_addr { host: "127.10.62.131" port: 41765 } } peers { permanent_uuid: "f990919f780a49e39097b5e2cd933557" member_type: VOTER last_known_addr { host: "127.10.62.130" port: 37021 } }
I20250626 01:57:59.119837 11678 raft_consensus.cc:383] T 50dded91450341c1bd9eeeb411f6be6c P c1be94bb90e44876a3647fd124cf6adf [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250626 01:57:59.120360 11678 raft_consensus.cc:738] T 50dded91450341c1bd9eeeb411f6be6c P c1be94bb90e44876a3647fd124cf6adf [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: c1be94bb90e44876a3647fd124cf6adf, State: Initialized, Role: FOLLOWER
I20250626 01:57:59.121917 11678 consensus_queue.cc:260] T 50dded91450341c1bd9eeeb411f6be6c P c1be94bb90e44876a3647fd124cf6adf [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "37c2019541f54a949ff92c6e59946308" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 41009 } } peers { permanent_uuid: "c1be94bb90e44876a3647fd124cf6adf" member_type: VOTER last_known_addr { host: "127.10.62.131" port: 41765 } } peers { permanent_uuid: "f990919f780a49e39097b5e2cd933557" member_type: VOTER last_known_addr { host: "127.10.62.130" port: 37021 } }
W20250626 01:57:59.125053 11387 tablet.cc:2378] T 50dded91450341c1bd9eeeb411f6be6c P 37c2019541f54a949ff92c6e59946308: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20250626 01:57:59.125978 11679 raft_consensus.cc:357] T 50dded91450341c1bd9eeeb411f6be6c P f990919f780a49e39097b5e2cd933557 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "37c2019541f54a949ff92c6e59946308" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 41009 } } peers { permanent_uuid: "c1be94bb90e44876a3647fd124cf6adf" member_type: VOTER last_known_addr { host: "127.10.62.131" port: 41765 } } peers { permanent_uuid: "f990919f780a49e39097b5e2cd933557" member_type: VOTER last_known_addr { host: "127.10.62.130" port: 37021 } }
I20250626 01:57:59.127391 11679 raft_consensus.cc:383] T 50dded91450341c1bd9eeeb411f6be6c P f990919f780a49e39097b5e2cd933557 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250626 01:57:59.127756 11679 raft_consensus.cc:738] T 50dded91450341c1bd9eeeb411f6be6c P f990919f780a49e39097b5e2cd933557 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: f990919f780a49e39097b5e2cd933557, State: Initialized, Role: FOLLOWER
I20250626 01:57:59.128942 11679 consensus_queue.cc:260] T 50dded91450341c1bd9eeeb411f6be6c P f990919f780a49e39097b5e2cd933557 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "37c2019541f54a949ff92c6e59946308" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 41009 } } peers { permanent_uuid: "c1be94bb90e44876a3647fd124cf6adf" member_type: VOTER last_known_addr { host: "127.10.62.131" port: 41765 } } peers { permanent_uuid: "f990919f780a49e39097b5e2cd933557" member_type: VOTER last_known_addr { host: "127.10.62.130" port: 37021 } }
I20250626 01:57:59.141758 11653 heartbeater.cc:499] Master 127.10.62.190:42257 was elected leader, sending a full tablet report...
I20250626 01:57:59.142796 11678 ts_tablet_manager.cc:1428] T 50dded91450341c1bd9eeeb411f6be6c P c1be94bb90e44876a3647fd124cf6adf: Time spent starting tablet: real 0.057s	user 0.051s	sys 0.003s
I20250626 01:57:59.149312 11679 ts_tablet_manager.cc:1428] T 50dded91450341c1bd9eeeb411f6be6c P f990919f780a49e39097b5e2cd933557: Time spent starting tablet: real 0.064s	user 0.030s	sys 0.023s
W20250626 01:57:59.228070 11521 tablet.cc:2378] T 50dded91450341c1bd9eeeb411f6be6c P f990919f780a49e39097b5e2cd933557: Can't schedule compaction. Clean time has not been advanced past its initial value.
W20250626 01:57:59.281929 11654 tablet.cc:2378] T 50dded91450341c1bd9eeeb411f6be6c P c1be94bb90e44876a3647fd124cf6adf: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20250626 01:57:59.450693 11683 raft_consensus.cc:491] T 50dded91450341c1bd9eeeb411f6be6c P 37c2019541f54a949ff92c6e59946308 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250626 01:57:59.451412 11683 raft_consensus.cc:513] T 50dded91450341c1bd9eeeb411f6be6c P 37c2019541f54a949ff92c6e59946308 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "37c2019541f54a949ff92c6e59946308" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 41009 } } peers { permanent_uuid: "c1be94bb90e44876a3647fd124cf6adf" member_type: VOTER last_known_addr { host: "127.10.62.131" port: 41765 } } peers { permanent_uuid: "f990919f780a49e39097b5e2cd933557" member_type: VOTER last_known_addr { host: "127.10.62.130" port: 37021 } }
I20250626 01:57:59.454782 11683 leader_election.cc:290] T 50dded91450341c1bd9eeeb411f6be6c P 37c2019541f54a949ff92c6e59946308 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers c1be94bb90e44876a3647fd124cf6adf (127.10.62.131:41765), f990919f780a49e39097b5e2cd933557 (127.10.62.130:37021)
I20250626 01:57:59.467175 11685 raft_consensus.cc:491] T 50dded91450341c1bd9eeeb411f6be6c P f990919f780a49e39097b5e2cd933557 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250626 01:57:59.467983 11685 raft_consensus.cc:513] T 50dded91450341c1bd9eeeb411f6be6c P f990919f780a49e39097b5e2cd933557 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "37c2019541f54a949ff92c6e59946308" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 41009 } } peers { permanent_uuid: "c1be94bb90e44876a3647fd124cf6adf" member_type: VOTER last_known_addr { host: "127.10.62.131" port: 41765 } } peers { permanent_uuid: "f990919f780a49e39097b5e2cd933557" member_type: VOTER last_known_addr { host: "127.10.62.130" port: 37021 } }
I20250626 01:57:59.471014 11475 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "50dded91450341c1bd9eeeb411f6be6c" candidate_uuid: "37c2019541f54a949ff92c6e59946308" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f990919f780a49e39097b5e2cd933557" is_pre_election: true
I20250626 01:57:59.472066 11475 raft_consensus.cc:2466] T 50dded91450341c1bd9eeeb411f6be6c P f990919f780a49e39097b5e2cd933557 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 37c2019541f54a949ff92c6e59946308 in term 0.
I20250626 01:57:59.472257 11608 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "50dded91450341c1bd9eeeb411f6be6c" candidate_uuid: "37c2019541f54a949ff92c6e59946308" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "c1be94bb90e44876a3647fd124cf6adf" is_pre_election: true
I20250626 01:57:59.473471 11608 raft_consensus.cc:2466] T 50dded91450341c1bd9eeeb411f6be6c P c1be94bb90e44876a3647fd124cf6adf [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 37c2019541f54a949ff92c6e59946308 in term 0.
I20250626 01:57:59.474431 11276 leader_election.cc:304] T 50dded91450341c1bd9eeeb411f6be6c P 37c2019541f54a949ff92c6e59946308 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 37c2019541f54a949ff92c6e59946308, f990919f780a49e39097b5e2cd933557; no voters: 
I20250626 01:57:59.475905 11683 raft_consensus.cc:2802] T 50dded91450341c1bd9eeeb411f6be6c P 37c2019541f54a949ff92c6e59946308 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20250626 01:57:59.476519 11683 raft_consensus.cc:491] T 50dded91450341c1bd9eeeb411f6be6c P 37c2019541f54a949ff92c6e59946308 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20250626 01:57:59.476969 11683 raft_consensus.cc:3058] T 50dded91450341c1bd9eeeb411f6be6c P 37c2019541f54a949ff92c6e59946308 [term 0 FOLLOWER]: Advancing to term 1
I20250626 01:57:59.477178 11685 leader_election.cc:290] T 50dded91450341c1bd9eeeb411f6be6c P f990919f780a49e39097b5e2cd933557 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 37c2019541f54a949ff92c6e59946308 (127.10.62.129:41009), c1be94bb90e44876a3647fd124cf6adf (127.10.62.131:41765)
I20250626 01:57:59.487062 11683 raft_consensus.cc:513] T 50dded91450341c1bd9eeeb411f6be6c P 37c2019541f54a949ff92c6e59946308 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "37c2019541f54a949ff92c6e59946308" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 41009 } } peers { permanent_uuid: "c1be94bb90e44876a3647fd124cf6adf" member_type: VOTER last_known_addr { host: "127.10.62.131" port: 41765 } } peers { permanent_uuid: "f990919f780a49e39097b5e2cd933557" member_type: VOTER last_known_addr { host: "127.10.62.130" port: 37021 } }
I20250626 01:57:59.489992 11683 leader_election.cc:290] T 50dded91450341c1bd9eeeb411f6be6c P 37c2019541f54a949ff92c6e59946308 [CANDIDATE]: Term 1 election: Requested vote from peers c1be94bb90e44876a3647fd124cf6adf (127.10.62.131:41765), f990919f780a49e39097b5e2cd933557 (127.10.62.130:37021)
I20250626 01:57:59.491328 11608 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "50dded91450341c1bd9eeeb411f6be6c" candidate_uuid: "37c2019541f54a949ff92c6e59946308" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "c1be94bb90e44876a3647fd124cf6adf"
I20250626 01:57:59.491859 11341 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "50dded91450341c1bd9eeeb411f6be6c" candidate_uuid: "f990919f780a49e39097b5e2cd933557" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "37c2019541f54a949ff92c6e59946308" is_pre_election: true
I20250626 01:57:59.492184 11608 raft_consensus.cc:3058] T 50dded91450341c1bd9eeeb411f6be6c P c1be94bb90e44876a3647fd124cf6adf [term 0 FOLLOWER]: Advancing to term 1
I20250626 01:57:59.493227 11341 raft_consensus.cc:2391] T 50dded91450341c1bd9eeeb411f6be6c P 37c2019541f54a949ff92c6e59946308 [term 1 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate f990919f780a49e39097b5e2cd933557 in current term 1: Already voted for candidate 37c2019541f54a949ff92c6e59946308 in this term.
I20250626 01:57:59.493260 11475 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "50dded91450341c1bd9eeeb411f6be6c" candidate_uuid: "37c2019541f54a949ff92c6e59946308" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f990919f780a49e39097b5e2cd933557"
I20250626 01:57:59.493991 11475 raft_consensus.cc:3058] T 50dded91450341c1bd9eeeb411f6be6c P f990919f780a49e39097b5e2cd933557 [term 0 FOLLOWER]: Advancing to term 1
I20250626 01:57:59.496315 11607 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "50dded91450341c1bd9eeeb411f6be6c" candidate_uuid: "f990919f780a49e39097b5e2cd933557" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "c1be94bb90e44876a3647fd124cf6adf" is_pre_election: true
I20250626 01:57:59.499408 11408 leader_election.cc:304] T 50dded91450341c1bd9eeeb411f6be6c P f990919f780a49e39097b5e2cd933557 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: f990919f780a49e39097b5e2cd933557; no voters: 37c2019541f54a949ff92c6e59946308, c1be94bb90e44876a3647fd124cf6adf
I20250626 01:57:59.500413 11608 raft_consensus.cc:2466] T 50dded91450341c1bd9eeeb411f6be6c P c1be94bb90e44876a3647fd124cf6adf [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 37c2019541f54a949ff92c6e59946308 in term 1.
I20250626 01:57:59.502445 11274 leader_election.cc:304] T 50dded91450341c1bd9eeeb411f6be6c P 37c2019541f54a949ff92c6e59946308 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 37c2019541f54a949ff92c6e59946308, c1be94bb90e44876a3647fd124cf6adf; no voters: 
I20250626 01:57:59.503371 11475 raft_consensus.cc:2466] T 50dded91450341c1bd9eeeb411f6be6c P f990919f780a49e39097b5e2cd933557 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 37c2019541f54a949ff92c6e59946308 in term 1.
I20250626 01:57:59.503525 11683 raft_consensus.cc:2802] T 50dded91450341c1bd9eeeb411f6be6c P 37c2019541f54a949ff92c6e59946308 [term 1 FOLLOWER]: Leader election won for term 1
I20250626 01:57:59.503911 11685 raft_consensus.cc:2747] T 50dded91450341c1bd9eeeb411f6be6c P f990919f780a49e39097b5e2cd933557 [term 1 FOLLOWER]: Leader pre-election lost for term 1. Reason: could not achieve majority
I20250626 01:57:59.506796 11683 raft_consensus.cc:695] T 50dded91450341c1bd9eeeb411f6be6c P 37c2019541f54a949ff92c6e59946308 [term 1 LEADER]: Becoming Leader. State: Replica: 37c2019541f54a949ff92c6e59946308, State: Running, Role: LEADER
I20250626 01:57:59.508093 11683 consensus_queue.cc:237] T 50dded91450341c1bd9eeeb411f6be6c P 37c2019541f54a949ff92c6e59946308 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "37c2019541f54a949ff92c6e59946308" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 41009 } } peers { permanent_uuid: "c1be94bb90e44876a3647fd124cf6adf" member_type: VOTER last_known_addr { host: "127.10.62.131" port: 41765 } } peers { permanent_uuid: "f990919f780a49e39097b5e2cd933557" member_type: VOTER last_known_addr { host: "127.10.62.130" port: 37021 } }
I20250626 01:57:59.523923 11198 catalog_manager.cc:5582] T 50dded91450341c1bd9eeeb411f6be6c P 37c2019541f54a949ff92c6e59946308 reported cstate change: term changed from 0 to 1, leader changed from <none> to 37c2019541f54a949ff92c6e59946308 (127.10.62.129). New cstate: current_term: 1 leader_uuid: "37c2019541f54a949ff92c6e59946308" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "37c2019541f54a949ff92c6e59946308" member_type: VOTER last_known_addr { host: "127.10.62.129" port: 41009 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "c1be94bb90e44876a3647fd124cf6adf" member_type: VOTER last_known_addr { host: "127.10.62.131" port: 41765 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "f990919f780a49e39097b5e2cd933557" member_type: VOTER last_known_addr { host: "127.10.62.130" port: 37021 } health_report { overall_health: UNKNOWN } } }
I20250626 01:57:59.638582 11475 raft_consensus.cc:1273] T 50dded91450341c1bd9eeeb411f6be6c P f990919f780a49e39097b5e2cd933557 [term 1 FOLLOWER]: Refusing update from remote peer 37c2019541f54a949ff92c6e59946308: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20250626 01:57:59.638487 11608 raft_consensus.cc:1273] T 50dded91450341c1bd9eeeb411f6be6c P c1be94bb90e44876a3647fd124cf6adf [term 1 FOLLOWER]: Refusing update from remote peer 37c2019541f54a949ff92c6e59946308: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20250626 01:57:59.641036 11690 consensus_queue.cc:1035] T 50dded91450341c1bd9eeeb411f6be6c P 37c2019541f54a949ff92c6e59946308 [LEADER]: Connected to new peer: Peer: permanent_uuid: "c1be94bb90e44876a3647fd124cf6adf" member_type: VOTER last_known_addr { host: "127.10.62.131" port: 41765 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.001s
I20250626 01:57:59.642133 11683 consensus_queue.cc:1035] T 50dded91450341c1bd9eeeb411f6be6c P 37c2019541f54a949ff92c6e59946308 [LEADER]: Connected to new peer: Peer: permanent_uuid: "f990919f780a49e39097b5e2cd933557" member_type: VOTER last_known_addr { host: "127.10.62.130" port: 37021 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20250626 01:57:59.681409 11697 mvcc.cc:204] Tried to move back new op lower bound from 7171699014175076352 to 7171699013674680320. Current Snapshot: MvccSnapshot[applied={T|T < 7171699014175076352}]
I20250626 01:57:59.687433 11699 mvcc.cc:204] Tried to move back new op lower bound from 7171699014175076352 to 7171699013674680320. Current Snapshot: MvccSnapshot[applied={T|T < 7171699014175076352}]
I20250626 01:57:59.763418 11341 tablet_service.cc:1940] Received Run Leader Election RPC: tablet_id: "50dded91450341c1bd9eeeb411f6be6c"
dest_uuid: "37c2019541f54a949ff92c6e59946308"
 from {username='slave'} at 127.0.0.1:58074
I20250626 01:57:59.764071 11341 raft_consensus.cc:474] T 50dded91450341c1bd9eeeb411f6be6c P 37c2019541f54a949ff92c6e59946308 [term 1 LEADER]: Not starting forced leader election -- already a leader
W20250626 01:58:04.654109 11387 tablet_replica_mm_ops.cc:240] Deltamemstore flush is disabled (check --enable_flush_deltamemstores)
W20250626 01:58:04.654542 11387 tablet_replica_mm_ops.cc:163] Memrowset flush is disabled (check --enable_flush_memrowset)
W20250626 01:58:04.767731 11521 tablet_replica_mm_ops.cc:240] Deltamemstore flush is disabled (check --enable_flush_deltamemstores)
W20250626 01:58:04.768168 11521 tablet_replica_mm_ops.cc:163] Memrowset flush is disabled (check --enable_flush_memrowset)
W20250626 01:58:04.811514 11654 tablet_replica_mm_ops.cc:240] Deltamemstore flush is disabled (check --enable_flush_deltamemstores)
W20250626 01:58:04.811868 11654 tablet_replica_mm_ops.cc:163] Memrowset flush is disabled (check --enable_flush_memrowset)
W20250626 01:58:14.377734 11649 debug-util.cc:398] Leaking SignalData structure 0x7b08000baca0 after lost signal to thread 11525
W20250626 01:58:14.379360 11649 debug-util.cc:398] Leaking SignalData structure 0x7b08000ac220 after lost signal to thread 11652
W20250626 01:58:56.498310 11382 debug-util.cc:398] Leaking SignalData structure 0x7b08000c80c0 after lost signal to thread 11258
W20250626 01:58:56.500003 11382 debug-util.cc:398] Leaking SignalData structure 0x7b08000ce320 after lost signal to thread 11385
W20250626 01:58:56.767365 11715 meta_cache.cc:1261] Time spent looking up entry by key: real 0.072s	user 0.004s	sys 0.000s
W20250626 01:58:57.953472 11711 meta_cache.cc:1261] Time spent looking up entry by key: real 0.071s	user 0.003s	sys 0.000s
/home/jenkins-slave/workspace/build_and_test_flaky@2/src/kudu/integration-tests/tablet_copy-itest.cc:2151: Failure
Failed
Bad status: Timed out: Timed out waiting for number of WAL segments on tablet 50dded91450341c1bd9eeeb411f6be6c on TS 0 to be 6. Found 5
I20250626 01:59:00.045675 10490 external_mini_cluster-itest-base.cc:80] Found fatal failure
I20250626 01:59:00.046149 10490 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 0 with UUID 37c2019541f54a949ff92c6e59946308 and pid 11257
************************ BEGIN STACKS **************************
[New LWP 11258]
[New LWP 11259]
[New LWP 11260]
[New LWP 11261]
[New LWP 11262]
[New LWP 11269]
[New LWP 11270]
[New LWP 11271]
[New LWP 11274]
[New LWP 11275]
[New LWP 11276]
[New LWP 11277]
[New LWP 11278]
[New LWP 11279]
[New LWP 11280]
[New LWP 11281]
[New LWP 11282]
[New LWP 11283]
[New LWP 11284]
[New LWP 11285]
[New LWP 11286]
[New LWP 11287]
[New LWP 11288]
[New LWP 11289]
[New LWP 11290]
[New LWP 11291]
[New LWP 11292]
[New LWP 11293]
[New LWP 11294]
[New LWP 11295]
[New LWP 11296]
[New LWP 11297]
[New LWP 11298]
[New LWP 11299]
[New LWP 11300]
[New LWP 11301]
[New LWP 11302]
[New LWP 11303]
[New LWP 11304]
[New LWP 11305]
[New LWP 11306]
[New LWP 11307]
[New LWP 11308]
[New LWP 11309]
[New LWP 11310]
[New LWP 11311]
[New LWP 11312]
[New LWP 11313]
[New LWP 11314]
[New LWP 11315]
[New LWP 11316]
[New LWP 11317]
[New LWP 11318]
[New LWP 11319]
[New LWP 11320]
[New LWP 11321]
[New LWP 11322]
[New LWP 11323]
[New LWP 11324]
[New LWP 11325]
[New LWP 11326]
[New LWP 11327]
[New LWP 11328]
[New LWP 11329]
[New LWP 11330]
[New LWP 11331]
[New LWP 11332]
[New LWP 11333]
[New LWP 11334]
[New LWP 11335]
[New LWP 11336]
[New LWP 11337]
[New LWP 11338]
[New LWP 11339]
[New LWP 11340]
[New LWP 11341]
[New LWP 11342]
[New LWP 11343]
[New LWP 11344]
[New LWP 11345]
[New LWP 11346]
[New LWP 11347]
[New LWP 11348]
[New LWP 11349]
[New LWP 11350]
[New LWP 11351]
[New LWP 11352]
[New LWP 11353]
[New LWP 11354]
[New LWP 11355]
[New LWP 11356]
[New LWP 11357]
[New LWP 11358]
[New LWP 11359]
[New LWP 11360]
[New LWP 11361]
[New LWP 11362]
[New LWP 11363]
[New LWP 11364]
[New LWP 11365]
[New LWP 11366]
[New LWP 11367]
[New LWP 11368]
[New LWP 11369]
[New LWP 11370]
[New LWP 11371]
[New LWP 11372]
[New LWP 11373]
[New LWP 11374]
[New LWP 11375]
[New LWP 11376]
[New LWP 11377]
[New LWP 11378]
[New LWP 11379]
[New LWP 11380]
[New LWP 11381]
[New LWP 11382]
[New LWP 11383]
[New LWP 11384]
[New LWP 11385]
[New LWP 11386]
[New LWP 11387]
[New LWP 11864]
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c4802038e
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c4802038e
0x00007f267212dd50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 11257 "kudu"  0x00007f267212dd50 in ?? ()
  2    LWP 11258 "kudu"  0x00007f266d4f17a0 in ?? ()
  3    LWP 11259 "kudu"  0x00007f2672129fb9 in ?? ()
  4    LWP 11260 "kudu"  0x00007f2672129fb9 in ?? ()
  5    LWP 11261 "kudu"  0x00007f2672129fb9 in ?? ()
  6    LWP 11262 "kernel-watcher-" 0x00007f2672129fb9 in ?? ()
  7    LWP 11269 "ntp client-1126" 0x00007f267212d9e2 in ?? ()
  8    LWP 11270 "file cache-evic" 0x00007f2672129fb9 in ?? ()
  9    LWP 11271 "sq_acceptor" 0x00007f266d521cb9 in ?? ()
  10   LWP 11274 "rpc reactor-112" 0x00007f266d52ea47 in ?? ()
  11   LWP 11275 "rpc reactor-112" 0x00007f266d52ea47 in ?? ()
  12   LWP 11276 "rpc reactor-112" 0x00007f266d52ea47 in ?? ()
  13   LWP 11277 "rpc reactor-112" 0x00007f266d52ea47 in ?? ()
  14   LWP 11278 "MaintenanceMgr " 0x00007f2672129ad3 in ?? ()
  15   LWP 11279 "txn-status-mana" 0x00007f2672129fb9 in ?? ()
  16   LWP 11280 "collect_and_rem" 0x00007f2672129fb9 in ?? ()
  17   LWP 11281 "tc-session-exp-" 0x00007f2672129fb9 in ?? ()
  18   LWP 11282 "rpc worker-1128" 0x00007f2672129ad3 in ?? ()
  19   LWP 11283 "rpc worker-1128" 0x00007f2672129ad3 in ?? ()
  20   LWP 11284 "rpc worker-1128" 0x00007f2672129ad3 in ?? ()
  21   LWP 11285 "rpc worker-1128" 0x00007f2672129ad3 in ?? ()
  22   LWP 11286 "rpc worker-1128" 0x00007f2672129ad3 in ?? ()
  23   LWP 11287 "rpc worker-1128" 0x00007f2672129ad3 in ?? ()
  24   LWP 11288 "rpc worker-1128" 0x00007f2672129ad3 in ?? ()
  25   LWP 11289 "rpc worker-1128" 0x00007f2672129ad3 in ?? ()
  26   LWP 11290 "rpc worker-1129" 0x00007f2672129ad3 in ?? ()
  27   LWP 11291 "rpc worker-1129" 0x00007f2672129ad3 in ?? ()
  28   LWP 11292 "rpc worker-1129" 0x00007f2672129ad3 in ?? ()
  29   LWP 11293 "rpc worker-1129" 0x00007f2672129ad3 in ?? ()
  30   LWP 11294 "rpc worker-1129" 0x00007f2672129ad3 in ?? ()
  31   LWP 11295 "rpc worker-1129" 0x00007f2672129ad3 in ?? ()
  32   LWP 11296 "rpc worker-1129" 0x00007f2672129ad3 in ?? ()
  33   LWP 11297 "rpc worker-1129" 0x00007f2672129ad3 in ?? ()
  34   LWP 11298 "rpc worker-1129" 0x00007f2672129ad3 in ?? ()
  35   LWP 11299 "rpc worker-1129" 0x00007f2672129ad3 in ?? ()
  36   LWP 11300 "rpc worker-1130" 0x00007f2672129ad3 in ?? ()
  37   LWP 11301 "rpc worker-1130" 0x00007f2672129ad3 in ?? ()
  38   LWP 11302 "rpc worker-1130" 0x00007f2672129ad3 in ?? ()
  39   LWP 11303 "rpc worker-1130" 0x00007f2672129ad3 in ?? ()
  40   LWP 11304 "rpc worker-1130" 0x00007f2672129ad3 in ?? ()
  41   LWP 11305 "rpc worker-1130" 0x00007f2672129ad3 in ?? ()
  42   LWP 11306 "rpc worker-1130" 0x00007f2672129ad3 in ?? ()
  43   LWP 11307 "rpc worker-1130" 0x00007f2672129ad3 in ?? ()
  44   LWP 11308 "rpc worker-1130" 0x00007f2672129ad3 in ?? ()
  45   LWP 11309 "rpc worker-1130" 0x00007f2672129ad3 in ?? ()
  46   LWP 11310 "rpc worker-1131" 0x00007f2672129ad3 in ?? ()
  47   LWP 11311 "rpc worker-1131" 0x00007f2672129ad3 in ?? ()
  48   LWP 11312 "rpc worker-1131" 0x00007f2672129ad3 in ?? ()
  49   LWP 11313 "rpc worker-1131" 0x00007f2672129ad3 in ?? ()
  50   LWP 11314 "rpc worker-1131" 0x00007f2672129ad3 in ?? ()
  51   LWP 11315 "rpc worker-1131" 0x00007f2672129ad3 in ?? ()
  52   LWP 11316 "rpc worker-1131" 0x00007f2672129ad3 in ?? ()
  53   LWP 11317 "rpc worker-1131" 0x00007f2672129ad3 in ?? ()
  54   LWP 11318 "rpc worker-1131" 0x00007f2672129ad3 in ?? ()
  55   LWP 11319 "rpc worker-1131" 0x00007f2672129ad3 in ?? ()
  56   LWP 11320 "rpc worker-1132" 0x00007f2672129ad3 in ?? ()
  57   LWP 11321 "rpc worker-1132" 0x00007f2672129ad3 in ?? ()
  58   LWP 11322 "rpc worker-1132" 0x00007f2672129ad3 in ?? ()
  59   LWP 11323 "rpc worker-1132" 0x00007f2672129ad3 in ?? ()
  60   LWP 11324 "rpc worker-1132" 0x00007f2672129ad3 in ?? ()
  61   LWP 11325 "rpc worker-1132" 0x00007f2672129ad3 in ?? ()
  62   LWP 11326 "rpc worker-1132" 0x00007f2672129ad3 in ?? ()
  63   LWP 11327 "rpc worker-1132" 0x00007f2672129ad3 in ?? ()
  64   LWP 11328 "rpc worker-1132" 0x00007f2672129ad3 in ?? ()
  65   LWP 11329 "rpc worker-1132" 0x00007f2672129ad3 in ?? ()
  66   LWP 11330 "rpc worker-1133" 0x00007f2672129ad3 in ?? ()
  67   LWP 11331 "rpc worker-1133" 0x00007f2672129ad3 in ?? ()
  68   LWP 11332 "rpc worker-1133" 0x00007f2672129ad3 in ?? ()
  69   LWP 11333 "rpc worker-1133" 0x00007f2672129ad3 in ?? ()
  70   LWP 11334 "rpc worker-1133" 0x00007f2672129ad3 in ?? ()
  71   LWP 11335 "rpc worker-1133" 0x00007f2672129ad3 in ?? ()
  72   LWP 11336 "rpc worker-1133" 0x00007f2672129ad3 in ?? ()
  73   LWP 11337 "rpc worker-1133" 0x00007f2672129ad3 in ?? ()
  74   LWP 11338 "rpc worker-1133" 0x00007f2672129ad3 in ?? ()
  75   LWP 11339 "rpc worker-1133" 0x00007f2672129ad3 in ?? ()
  76   LWP 11340 "rpc worker-1134" 0x00007f2672129ad3 in ?? ()
  77   LWP 11341 "rpc worker-1134" 0x00007f2672129ad3 in ?? ()
  78   LWP 11342 "rpc worker-1134" 0x00007f2672129ad3 in ?? ()
  79   LWP 11343 "rpc worker-1134" 0x00007f2672129ad3 in ?? ()
  80   LWP 11344 "rpc worker-1134" 0x00007f2672129ad3 in ?? ()
  81   LWP 11345 "rpc worker-1134" 0x00007f2672129ad3 in ?? ()
  82   LWP 11346 "rpc worker-1134" 0x00007f2672129ad3 in ?? ()
  83   LWP 11347 "rpc worker-1134" 0x00007f2672129ad3 in ?? ()
  84   LWP 11348 "rpc worker-1134" 0x00007f2672129ad3 in ?? ()
  85   LWP 11349 "rpc worker-1134" 0x00007f2672129ad3 in ?? ()
  86   LWP 11350 "rpc worker-1135" 0x00007f2672129ad3 in ?? ()
  87   LWP 11351 "rpc worker-1135" 0x00007f2672129ad3 in ?? ()
  88   LWP 11352 "rpc worker-1135" 0x00007f2672129ad3 in ?? ()
  89   LWP 11353 "rpc worker-1135" 0x00007f2672129ad3 in ?? ()
  90   LWP 11354 "rpc worker-1135" 0x00007f2672129ad3 in ?? ()
  91   LWP 11355 "rpc worker-1135" 0x00007f2672129ad3 in ?? ()
  92   LWP 11356 "rpc worker-1135" 0x00007f2672129ad3 in ?? ()
  93   LWP 11357 "rpc worker-1135" 0x00007f2672129ad3 in ?? ()
  94   LWP 11358 "rpc worker-1135" 0x00007f2672129ad3 in ?? ()
  95   LWP 11359 "rpc worker-1135" 0x00007f2672129ad3 in ?? ()
  96   LWP 11360 "rpc worker-1136" 0x00007f2672129ad3 in ?? ()
  97   LWP 11361 "rpc worker-1136" 0x00007f2672129ad3 in ?? ()
  98   LWP 11362 "rpc worker-1136" 0x00007f2672129ad3 in ?? ()
  99   LWP 11363 "rpc worker-1136" 0x00007f2672129ad3 in ?? ()
  100  LWP 11364 "rpc worker-1136" 0x00007f2672129ad3 in ?? ()
  101  LWP 11365 "rpc worker-1136" 0x00007f2672129ad3 in ?? ()
  102  LWP 11366 "rpc worker-1136" 0x00007f2672129ad3 in ?? ()
  103  LWP 11367 "rpc worker-1136" 0x00007f2672129ad3 in ?? ()
  104  LWP 11368 "rpc worker-1136" 0x00007f2672129ad3 in ?? ()
  105  LWP 11369 "rpc worker-1136" 0x00007f2672129ad3 in ?? ()
  106  LWP 11370 "rpc worker-1137" 0x00007f2672129ad3 in ?? ()
  107  LWP 11371 "rpc worker-1137" 0x00007f2672129ad3 in ?? ()
  108  LWP 11372 "rpc worker-1137" 0x00007f2672129ad3 in ?? ()
  109  LWP 11373 "rpc worker-1137" 0x00007f2672129ad3 in ?? ()
  110  LWP 11374 "rpc worker-1137" 0x00007f2672129ad3 in ?? ()
  111  LWP 11375 "rpc worker-1137" 0x00007f2672129ad3 in ?? ()
  112  LWP 11376 "rpc worker-1137" 0x00007f2672129ad3 in ?? ()
  113  LWP 11377 "rpc worker-1137" 0x00007f2672129ad3 in ?? ()
  114  LWP 11378 "rpc worker-1137" 0x00007f2672129ad3 in ?? ()
  115  LWP 11379 "rpc worker-1137" 0x00007f2672129ad3 in ?? ()
  116  LWP 11380 "rpc worker-1138" 0x00007f2672129ad3 in ?? ()
  117  LWP 11381 "rpc worker-1138" 0x00007f2672129ad3 in ?? ()
  118  LWP 11382 "diag-logger-113" 0x00007f2672129fb9 in ?? ()
  119  LWP 11383 "result-tracker-" 0x00007f2672129fb9 in ?? ()
  120  LWP 11384 "excess-log-dele" 0x00007f2672129fb9 in ?? ()
  121  LWP 11385 "acceptor-11385" 0x00007f266d5300c7 in ?? ()
  122  LWP 11386 "heartbeat-11386" 0x00007f2672129fb9 in ?? ()
  123  LWP 11387 "maintenance_sch" 0x00007f2672129fb9 in ?? ()
  124  LWP 11864 "raft [worker]-1" 0x00007f2672129fb9 in ?? ()

Thread 124 (LWP 11864):
#0  0x00007f2672129fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 123 (LWP 11387):
#0  0x00007f2672129fb9 in ?? ()
#1  0x00007f0100000000 in ?? ()
#2  0x000000000000010a in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b54000028f0 in ?? ()
#5  0x00007f26263b96c0 in ?? ()
#6  0x0000000000000214 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 11386):
#0  0x00007f2672129fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 121 (LWP 11385):
#0  0x00007f266d5300c7 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 120 (LWP 11384):
#0  0x00007f2672129fb9 in ?? ()
#1  0x00007f2627bbc940 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffcc319ba00 in ?? ()
#5  0x00007f2627bbc7b0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 11383):
#0  0x00007f2672129fb9 in ?? ()
#1  0x0000000085352fb8 in ?? ()
#2  0x0000000000000042 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b3400001008 in ?? ()
#5  0x00007f26283bd800 in ?? ()
#6  0x0000000000000084 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 118 (LWP 11382):
#0  0x00007f2672129fb9 in ?? ()
#1  0x00007f266b488008 in ?? ()
#2  0x000000000000003e in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4000000c90 in ?? ()
#5  0x00007f2628bbe750 in ?? ()
#6  0x000000000000007c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 11381):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 116 (LWP 11380):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 115 (LWP 11379):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 114 (LWP 11378):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 11377):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 11376):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 11375):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 11374):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 11373):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 11372):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 11371):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 11370):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 11369):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 11368):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 11367):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 11366):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 11365):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 11364):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 11363):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 11362):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 97 (LWP 11361):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 11360):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 11359):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 11358):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 11357):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 11356):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 11355):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 11354):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 11353):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 11352):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 11351):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 11350):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 11349):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 11348):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 11347):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 11346):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 11345):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 11344):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 11343):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 11342):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 11341):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00007b24001147c8 in ?? ()
#4  0x00007f263ddba710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f263ddba730 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 76 (LWP 11340):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 75 (LWP 11339):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 74 (LWP 11338):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 11337):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 11336):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 11335):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 11334):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 11333):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 11332):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 11331):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 11330):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 11329):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 11328):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 11327):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 11326):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 11325):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 11324):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 11323):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 11322):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 11321):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b24000b902c in ?? ()
#4  0x00007f26481bc710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f26481bc730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x007f0400000026c8 in ?? ()
#9  0x00007f2672129770 in ?? ()
#10 0x00007f26481bc730 in ?? ()
#11 0x0002008300000dfe in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 56 (LWP 11320):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 55 (LWP 11319):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 11318):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 11317):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 11316):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 11315):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 11314):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 11313):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 11312):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 11311):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 11310):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 11309):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 11308):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 11307):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 11306):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 11305):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 11304):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 11303):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 11302):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 11301):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000167 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240005ffec in ?? ()
#4  0x00007f26525be710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f26525be730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000000000045e4c9 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007f2672129770 in ?? ()
#10 0x00007f26525be730 in ?? ()
#11 0x00007f26321adca0 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 36 (LWP 11300):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000290 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00007b240005d7f8 in ?? ()
#4  0x00007f2652fb6710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f2652fb6730 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 35 (LWP 11299):
#0  0x00007f2672129ad3 in ?? ()
#1  0x00000000000003dd in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b2400058ffc in ?? ()
#4  0x00007f26537b7710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f26537b7730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000000000045e4c9 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007f2672129770 in ?? ()
#10 0x00007f26537b7730 in ?? ()
#11 0x00007f263217eaa0 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 34 (LWP 11298):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 33 (LWP 11297):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 32 (LWP 11296):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 31 (LWP 11295):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 30 (LWP 11294):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 11293):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 11292):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 11291):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 11290):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 11289):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 11288):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 11287):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 11286):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 11285):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 11284):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 19 (LWP 11283):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 18 (LWP 11282):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 17 (LWP 11281):
#0  0x00007f2672129fb9 in ?? ()
#1  0x0000000017a335f0 in ?? ()
#2  0x0000000000000006 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4800003a00 in ?? ()
#5  0x00007f265cb92700 in ?? ()
#6  0x000000000000000c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 16 (LWP 11280):
#0  0x00007f2672129fb9 in ?? ()
#1  0x00007f265d3939a8 in ?? ()
#2  0x000000000000000d in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4400037198 in ?? ()
#5  0x00007f265d393840 in ?? ()
#6  0x000000000000001a in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 15 (LWP 11279):
#0  0x00007f2672129fb9 in ?? ()
#1  0x0000000000000018 in ?? ()
#2  0x0000000000000006 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b5800000118 in ?? ()
#5  0x00007f265db94410 in ?? ()
#6  0x000000000000000c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 11278):
#0  0x00007f2672129ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 13 (LWP 11277):
#0  0x00007f266d52ea47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 11276):
#0  0x00007f266d52ea47 in ?? ()
#1  0x00007b280002a128 in ?? ()
#2  0x0044e000029c2742 in ?? ()
#3  0x00007f265f397500 in ?? ()
#4  0x00007f265f398b80 in ?? ()
#5  0x00007f265f397500 in ?? ()
#6  0x0000000000000011 in ?? ()
#7  0x00007b5800001800 in ?? ()
#8  0x0000000000488695 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007f266aca2000 in ?? ()
#10 0x0000000000488599 in __sanitizer::internal_alloc_placeholder ()
#11 0x00007f265f398b80 in ?? ()
#12 0x00007f266ff87069 in ?? ()
#13 0x00007b4c00000000 in ?? ()
#14 0x00007f26757081a0 in ?? ()
#15 0x00007b4c00002f90 in ?? ()
#16 0x00007b4c00002f98 in ?? ()
#17 0x00007f265f3977a0 in ?? ()
#18 0x00007b4400033d00 in ?? ()
#19 0x00007f265f397cd0 in ?? ()
#20 0x0000000000000000 in ?? ()

Thread 11 (LWP 11275):
#0  0x00007f266d52ea47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 10 (LWP 11274):
#0  0x00007f266d52ea47 in ?? ()
#1  0x00007b5800010408 in ?? ()
#2  0x003ce00001c280a3 in ?? ()
#3  0x00007f2662bbe500 in ?? ()
#4  0x00007f2662bbfb80 in ?? ()
#5  0x00007f2662bbe500 in ?? ()
#6  0x000000000000000d in ?? ()
#7  0x00007b5800000f00 in ?? ()
#8  0x0000000000488695 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007f266acc6000 in ?? ()
#10 0x0000000000488599 in __sanitizer::internal_alloc_placeholder ()
#11 0x00007f2662bbfb80 in ?? ()
#12 0x00007f266ff87069 in ?? ()
#13 0x00007b4c00000000 in ?? ()
#14 0x00007f26757081a0 in ?? ()
#15 0x00007b4c00002c10 in ?? ()
#16 0x00007b4c00002c18 in ?? ()
#17 0x00007f2662bbe7a0 in ?? ()
#18 0x00007b4400036a00 in ?? ()
#19 0x00007f2662bbecd0 in ?? ()
#20 0x0000000000000000 in ?? ()

Thread 9 (LWP 11271):
#0  0x00007f266d521cb9 in ?? ()
#1  0x00007f26663bcc10 in ?? ()
#2  0x00007b040000a860 in ?? ()
#3  0x00007f26663bdb80 in ?? ()
#4  0x00007f26663bcc10 in ?? ()
#5  0x00007b040000a860 in ?? ()
#6  0x00000000004888a3 in __sanitizer::internal_alloc_placeholder ()
#7  0x00007f266ae5a000 in ?? ()
#8  0x0100000000000001 in ?? ()
#9  0x00007f26663bdb80 in ?? ()
#10 0x00007f2676f06b28 in ?? ()
#11 0x0000000000000000 in ?? ()

Thread 8 (LWP 11270):
#0  0x00007f2672129fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 7 (LWP 11269):
#0  0x00007f267212d9e2 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 6 (LWP 11262):
#0  0x00007f2672129fb9 in ?? ()
#1  0x00007f26673bea40 in ?? ()
#2  0x0000000000000154 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b44000361d8 in ?? ()
#5  0x00007f26673be5d0 in ?? ()
#6  0x00000000000002a8 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 5 (LWP 11261):
#0  0x00007f2672129fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 4 (LWP 11260):
#0  0x00007f2672129fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 3 (LWP 11259):
#0  0x00007f2672129fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 2 (LWP 11258):
#0  0x00007f266d4f17a0 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 1 (LWP 11257):
#0  0x00007f267212dd50 in ?? ()
#1  0x00007ffcc319b870 in ?? ()
#2  0x0000000000467b2b in __sanitizer::internal_alloc_placeholder ()
#3  0x00007f266c74fcc0 in ?? ()
#4  0x00007f266c74fcc0 in ?? ()
#5  0x00007ffcc319b810 in ?? ()
#6  0x000000000048aef4 in __sanitizer::internal_alloc_placeholder ()
#7  0x0000600001000078 in ?? ()
#8  0xffffffff00a96d0f in ?? ()
#9  0x00007f266c74fcc0 in ?? ()
#10 0x00007f2670655f0b in ?? ()
#11 0x0000000000000000 in ?? ()
************************* END STACKS ***************************
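The per-thread dumps above and below follow gdb's batch output format ("[New LWP ...]", an "Id / Target Id / Frame" table, then "Thread N (LWP ...)" backtraces). As a hedged sketch only, and not necessarily the mechanism the test harness itself uses, a similar dump can be reproduced by hand against one of the tablet server pids logged here (for example, pid 11390 from the "Attempting to dump stacks of TS 1" line below), assuming gdb is installed and the process is attachable:

    #!/usr/bin/env python3
    # Sketch: capture a gdb-style per-thread stack dump of a running process.
    # Assumption: gdb is on PATH and the target pid (e.g. 11390 from the log
    # line below) is attachable; this is illustrative, not the harness's code.
    import subprocess
    import sys

    def dump_stacks(pid: int) -> str:
        """Attach gdb in batch mode and return an all-threads backtrace."""
        result = subprocess.run(
            ["gdb", "-p", str(pid), "-batch",
             "-ex", "set pagination off",
             "-ex", "info threads",
             "-ex", "thread apply all bt"],
            capture_output=True, text=True, check=False)
        return result.stdout

    if __name__ == "__main__":
        print(dump_stacks(int(sys.argv[1])))

Frames printed as "?? ()" in the dumps simply mean gdb could not resolve a symbol for that address in this binary; only a handful of frames resolve (e.g. the __sanitizer ones), so the raw addresses are kept as-is below.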
I20250626 01:59:01.247025 10490 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 1 with UUID f990919f780a49e39097b5e2cd933557 and pid 11390
************************ BEGIN STACKS **************************
[New LWP 11391]
[New LWP 11392]
[New LWP 11393]
[New LWP 11394]
[New LWP 11395]
[New LWP 11403]
[New LWP 11404]
[New LWP 11405]
[New LWP 11408]
[New LWP 11409]
[New LWP 11410]
[New LWP 11411]
[New LWP 11412]
[New LWP 11413]
[New LWP 11414]
[New LWP 11415]
[New LWP 11416]
[New LWP 11417]
[New LWP 11418]
[New LWP 11419]
[New LWP 11420]
[New LWP 11421]
[New LWP 11422]
[New LWP 11423]
[New LWP 11424]
[New LWP 11425]
[New LWP 11426]
[New LWP 11427]
[New LWP 11428]
[New LWP 11429]
[New LWP 11430]
[New LWP 11431]
[New LWP 11432]
[New LWP 11433]
[New LWP 11434]
[New LWP 11435]
[New LWP 11436]
[New LWP 11437]
[New LWP 11438]
[New LWP 11439]
[New LWP 11440]
[New LWP 11441]
[New LWP 11442]
[New LWP 11443]
[New LWP 11444]
[New LWP 11445]
[New LWP 11446]
[New LWP 11447]
[New LWP 11448]
[New LWP 11449]
[New LWP 11450]
[New LWP 11451]
[New LWP 11452]
[New LWP 11453]
[New LWP 11454]
[New LWP 11455]
[New LWP 11456]
[New LWP 11457]
[New LWP 11458]
[New LWP 11459]
[New LWP 11460]
[New LWP 11461]
[New LWP 11462]
[New LWP 11463]
[New LWP 11464]
[New LWP 11465]
[New LWP 11466]
[New LWP 11467]
[New LWP 11468]
[New LWP 11469]
[New LWP 11470]
[New LWP 11471]
[New LWP 11472]
[New LWP 11473]
[New LWP 11474]
[New LWP 11475]
[New LWP 11476]
[New LWP 11477]
[New LWP 11478]
[New LWP 11479]
[New LWP 11480]
[New LWP 11481]
[New LWP 11482]
[New LWP 11483]
[New LWP 11484]
[New LWP 11485]
[New LWP 11486]
[New LWP 11487]
[New LWP 11488]
[New LWP 11489]
[New LWP 11490]
[New LWP 11491]
[New LWP 11492]
[New LWP 11493]
[New LWP 11494]
[New LWP 11495]
[New LWP 11496]
[New LWP 11497]
[New LWP 11498]
[New LWP 11499]
[New LWP 11500]
[New LWP 11501]
[New LWP 11502]
[New LWP 11503]
[New LWP 11504]
[New LWP 11505]
[New LWP 11506]
[New LWP 11507]
[New LWP 11508]
[New LWP 11509]
[New LWP 11510]
[New LWP 11511]
[New LWP 11512]
[New LWP 11513]
[New LWP 11514]
[New LWP 11515]
[New LWP 11516]
[New LWP 11517]
[New LWP 11518]
[New LWP 11519]
[New LWP 11520]
[New LWP 11521]
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c4802038e
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c4802038e
0x00007f7923b56d50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 11390 "kudu"  0x00007f7923b56d50 in ?? ()
  2    LWP 11391 "kudu"  0x00007f791ef1a7a0 in ?? ()
  3    LWP 11392 "kudu"  0x00007f7923b52fb9 in ?? ()
  4    LWP 11393 "kudu"  0x00007f7923b52fb9 in ?? ()
  5    LWP 11394 "kudu"  0x00007f7923b52fb9 in ?? ()
  6    LWP 11395 "kernel-watcher-" 0x00007f7923b52fb9 in ?? ()
  7    LWP 11403 "ntp client-1140" 0x00007f7923b569e2 in ?? ()
  8    LWP 11404 "file cache-evic" 0x00007f7923b52fb9 in ?? ()
  9    LWP 11405 "sq_acceptor" 0x00007f791ef4acb9 in ?? ()
  10   LWP 11408 "rpc reactor-114" 0x00007f791ef57a47 in ?? ()
  11   LWP 11409 "rpc reactor-114" 0x00007f791ef57a47 in ?? ()
  12   LWP 11410 "rpc reactor-114" 0x00007f791ef57a47 in ?? ()
  13   LWP 11411 "rpc reactor-114" 0x00007f791ef57a47 in ?? ()
  14   LWP 11412 "MaintenanceMgr " 0x00007f7923b52ad3 in ?? ()
  15   LWP 11413 "txn-status-mana" 0x00007f7923b52fb9 in ?? ()
  16   LWP 11414 "collect_and_rem" 0x00007f7923b52fb9 in ?? ()
  17   LWP 11415 "tc-session-exp-" 0x00007f7923b52fb9 in ?? ()
  18   LWP 11416 "rpc worker-1141" 0x00007f7923b52ad3 in ?? ()
  19   LWP 11417 "rpc worker-1141" 0x00007f7923b52ad3 in ?? ()
  20   LWP 11418 "rpc worker-1141" 0x00007f7923b52ad3 in ?? ()
  21   LWP 11419 "rpc worker-1141" 0x00007f7923b52ad3 in ?? ()
  22   LWP 11420 "rpc worker-1142" 0x00007f7923b52ad3 in ?? ()
  23   LWP 11421 "rpc worker-1142" 0x00007f7923b52ad3 in ?? ()
  24   LWP 11422 "rpc worker-1142" 0x00007f7923b52ad3 in ?? ()
  25   LWP 11423 "rpc worker-1142" 0x00007f7923b52ad3 in ?? ()
  26   LWP 11424 "rpc worker-1142" 0x00007f7923b52ad3 in ?? ()
  27   LWP 11425 "rpc worker-1142" 0x00007f7923b52ad3 in ?? ()
  28   LWP 11426 "rpc worker-1142" 0x00007f7923b52ad3 in ?? ()
  29   LWP 11427 "rpc worker-1142" 0x00007f7923b52ad3 in ?? ()
  30   LWP 11428 "rpc worker-1142" 0x00007f7923b52ad3 in ?? ()
  31   LWP 11429 "rpc worker-1142" 0x00007f7923b52ad3 in ?? ()
  32   LWP 11430 "rpc worker-1143" 0x00007f7923b52ad3 in ?? ()
  33   LWP 11431 "rpc worker-1143" 0x00007f7923b52ad3 in ?? ()
  34   LWP 11432 "rpc worker-1143" 0x00007f7923b52ad3 in ?? ()
  35   LWP 11433 "rpc worker-1143" 0x00007f7923b52ad3 in ?? ()
  36   LWP 11434 "rpc worker-1143" 0x00007f7923b52ad3 in ?? ()
  37   LWP 11435 "rpc worker-1143" 0x00007f7923b52ad3 in ?? ()
  38   LWP 11436 "rpc worker-1143" 0x00007f7923b52ad3 in ?? ()
  39   LWP 11437 "rpc worker-1143" 0x00007f7923b52ad3 in ?? ()
  40   LWP 11438 "rpc worker-1143" 0x00007f7923b52ad3 in ?? ()
  41   LWP 11439 "rpc worker-1143" 0x00007f7923b52ad3 in ?? ()
  42   LWP 11440 "rpc worker-1144" 0x00007f7923b52ad3 in ?? ()
  43   LWP 11441 "rpc worker-1144" 0x00007f7923b52ad3 in ?? ()
  44   LWP 11442 "rpc worker-1144" 0x00007f7923b52ad3 in ?? ()
  45   LWP 11443 "rpc worker-1144" 0x00007f7923b52ad3 in ?? ()
  46   LWP 11444 "rpc worker-1144" 0x00007f7923b52ad3 in ?? ()
  47   LWP 11445 "rpc worker-1144" 0x00007f7923b52ad3 in ?? ()
  48   LWP 11446 "rpc worker-1144" 0x00007f7923b52ad3 in ?? ()
  49   LWP 11447 "rpc worker-1144" 0x00007f7923b52ad3 in ?? ()
  50   LWP 11448 "rpc worker-1144" 0x00007f7923b52ad3 in ?? ()
  51   LWP 11449 "rpc worker-1144" 0x00007f7923b52ad3 in ?? ()
  52   LWP 11450 "rpc worker-1145" 0x00007f7923b52ad3 in ?? ()
  53   LWP 11451 "rpc worker-1145" 0x00007f7923b52ad3 in ?? ()
  54   LWP 11452 "rpc worker-1145" 0x00007f7923b52ad3 in ?? ()
  55   LWP 11453 "rpc worker-1145" 0x00007f7923b52ad3 in ?? ()
  56   LWP 11454 "rpc worker-1145" 0x00007f7923b52ad3 in ?? ()
  57   LWP 11455 "rpc worker-1145" 0x00007f7923b52ad3 in ?? ()
  58   LWP 11456 "rpc worker-1145" 0x00007f7923b52ad3 in ?? ()
  59   LWP 11457 "rpc worker-1145" 0x00007f7923b52ad3 in ?? ()
  60   LWP 11458 "rpc worker-1145" 0x00007f7923b52ad3 in ?? ()
  61   LWP 11459 "rpc worker-1145" 0x00007f7923b52ad3 in ?? ()
  62   LWP 11460 "rpc worker-1146" 0x00007f7923b52ad3 in ?? ()
  63   LWP 11461 "rpc worker-1146" 0x00007f7923b52ad3 in ?? ()
  64   LWP 11462 "rpc worker-1146" 0x00007f7923b52ad3 in ?? ()
  65   LWP 11463 "rpc worker-1146" 0x00007f7923b52ad3 in ?? ()
  66   LWP 11464 "rpc worker-1146" 0x00007f7923b52ad3 in ?? ()
  67   LWP 11465 "rpc worker-1146" 0x00007f7923b52ad3 in ?? ()
  68   LWP 11466 "rpc worker-1146" 0x00007f7923b52ad3 in ?? ()
  69   LWP 11467 "rpc worker-1146" 0x00007f7923b52ad3 in ?? ()
  70   LWP 11468 "rpc worker-1146" 0x00007f7923b52ad3 in ?? ()
  71   LWP 11469 "rpc worker-1146" 0x00007f7923b52ad3 in ?? ()
  72   LWP 11470 "rpc worker-1147" 0x00007f7923b52ad3 in ?? ()
  73   LWP 11471 "rpc worker-1147" 0x00007f7923b52ad3 in ?? ()
  74   LWP 11472 "rpc worker-1147" 0x00007f7923b52ad3 in ?? ()
  75   LWP 11473 "rpc worker-1147" 0x00007f7923b52ad3 in ?? ()
  76   LWP 11474 "rpc worker-1147" 0x00007f7923b52ad3 in ?? ()
  77   LWP 11475 "rpc worker-1147" 0x00007f7923b52ad3 in ?? ()
  78   LWP 11476 "rpc worker-1147" 0x00007f7923b52ad3 in ?? ()
  79   LWP 11477 "rpc worker-1147" 0x00007f7923b52ad3 in ?? ()
  80   LWP 11478 "rpc worker-1147" 0x00007f7923b52ad3 in ?? ()
  81   LWP 11479 "rpc worker-1147" 0x00007f7923b52ad3 in ?? ()
  82   LWP 11480 "rpc worker-1148" 0x00007f7923b52ad3 in ?? ()
  83   LWP 11481 "rpc worker-1148" 0x00007f7923b52ad3 in ?? ()
  84   LWP 11482 "rpc worker-1148" 0x00007f7923b52ad3 in ?? ()
  85   LWP 11483 "rpc worker-1148" 0x00007f7923b52ad3 in ?? ()
  86   LWP 11484 "rpc worker-1148" 0x00007f7923b52ad3 in ?? ()
  87   LWP 11485 "rpc worker-1148" 0x00007f7923b52ad3 in ?? ()
  88   LWP 11486 "rpc worker-1148" 0x00007f7923b52ad3 in ?? ()
  89   LWP 11487 "rpc worker-1148" 0x00007f7923b52ad3 in ?? ()
  90   LWP 11488 "rpc worker-1148" 0x00007f7923b52ad3 in ?? ()
  91   LWP 11489 "rpc worker-1148" 0x00007f7923b52ad3 in ?? ()
  92   LWP 11490 "rpc worker-1149" 0x00007f7923b52ad3 in ?? ()
  93   LWP 11491 "rpc worker-1149" 0x00007f7923b52ad3 in ?? ()
  94   LWP 11492 "rpc worker-1149" 0x00007f7923b52ad3 in ?? ()
  95   LWP 11493 "rpc worker-1149" 0x00007f7923b52ad3 in ?? ()
  96   LWP 11494 "rpc worker-1149" 0x00007f7923b52ad3 in ?? ()
  97   LWP 11495 "rpc worker-1149" 0x00007f7923b52ad3 in ?? ()
  98   LWP 11496 "rpc worker-1149" 0x00007f7923b52ad3 in ?? ()
  99   LWP 11497 "rpc worker-1149" 0x00007f7923b52ad3 in ?? ()
  100  LWP 11498 "rpc worker-1149" 0x00007f7923b52ad3 in ?? ()
  101  LWP 11499 "rpc worker-1149" 0x00007f7923b52ad3 in ?? ()
  102  LWP 11500 "rpc worker-1150" 0x00007f7923b52ad3 in ?? ()
  103  LWP 11501 "rpc worker-1150" 0x00007f7923b52ad3 in ?? ()
  104  LWP 11502 "rpc worker-1150" 0x00007f7923b52ad3 in ?? ()
  105  LWP 11503 "rpc worker-1150" 0x00007f7923b52ad3 in ?? ()
  106  LWP 11504 "rpc worker-1150" 0x00007f7923b52ad3 in ?? ()
  107  LWP 11505 "rpc worker-1150" 0x00007f7923b52ad3 in ?? ()
  108  LWP 11506 "rpc worker-1150" 0x00007f7923b52ad3 in ?? ()
  109  LWP 11507 "rpc worker-1150" 0x00007f7923b52ad3 in ?? ()
  110  LWP 11508 "rpc worker-1150" 0x00007f7923b52ad3 in ?? ()
  111  LWP 11509 "rpc worker-1150" 0x00007f7923b52ad3 in ?? ()
  112  LWP 11510 "rpc worker-1151" 0x00007f7923b52ad3 in ?? ()
  113  LWP 11511 "rpc worker-1151" 0x00007f7923b52ad3 in ?? ()
  114  LWP 11512 "rpc worker-1151" 0x00007f7923b52ad3 in ?? ()
  115  LWP 11513 "rpc worker-1151" 0x00007f7923b52ad3 in ?? ()
  116  LWP 11514 "rpc worker-1151" 0x00007f7923b52ad3 in ?? ()
  117  LWP 11515 "rpc worker-1151" 0x00007f7923b52ad3 in ?? ()
  118  LWP 11516 "diag-logger-115" 0x00007f7923b52fb9 in ?? ()
  119  LWP 11517 "result-tracker-" 0x00007f7923b52fb9 in ?? ()
  120  LWP 11518 "excess-log-dele" 0x00007f7923b52fb9 in ?? ()
  121  LWP 11519 "acceptor-11519" 0x00007f791ef590c7 in ?? ()
  122  LWP 11520 "heartbeat-11520" 0x00007f7923b52fb9 in ?? ()
  123  LWP 11521 "maintenance_sch" 0x00007f7923b52fb9 in ?? ()

Thread 123 (LWP 11521):
#0  0x00007f7923b52fb9 in ?? ()
#1  0x00007b0100000000 in ?? ()
#2  0x0000000000000104 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b54000028f0 in ?? ()
#5  0x00007f78d7db96c0 in ?? ()
#6  0x0000000000000208 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 11520):
#0  0x00007f7923b52fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 121 (LWP 11519):
#0  0x00007f791ef590c7 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 120 (LWP 11518):
#0  0x00007f7923b52fb9 in ?? ()
#1  0x00007f78d95bc940 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007fff184b78a0 in ?? ()
#5  0x00007f78d95bc7b0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 11517):
#0  0x00007f7923b52fb9 in ?? ()
#1  0x0000000085352fb8 in ?? ()
#2  0x0000000000000041 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b3400001008 in ?? ()
#5  0x00007f78d9dbd800 in ?? ()
#6  0x0000000000000082 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 118 (LWP 11516):
#0  0x00007f7923b52fb9 in ?? ()
#1  0x00007f791ce88008 in ?? ()
#2  0x0000000000000041 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4000000c90 in ?? ()
#5  0x00007f78da5be750 in ?? ()
#6  0x0000000000000082 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 11515):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 116 (LWP 11514):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 115 (LWP 11513):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 114 (LWP 11512):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 11511):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 11510):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 11509):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 11508):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 11507):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 11506):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 11505):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 11504):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 11503):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 11502):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 11501):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 11500):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 11499):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 11498):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 11497):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 11496):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 97 (LWP 11495):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 11494):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 11493):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 11492):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 11491):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 11490):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 11489):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 11488):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 11487):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 11486):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 11485):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 11484):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 11483):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 11482):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 11481):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 11480):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 11479):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 11478):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 11477):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 11476):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 11475):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000772 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00007b24001147c8 in ?? ()
#4  0x00007f78ef7ba710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f78ef7ba730 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 76 (LWP 11474):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x00000000000007b1 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240010ffcc in ?? ()
#4  0x00007f78effbb710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f78effbb730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000000000045e4c9 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007f7923b52770 in ?? ()
#10 0x00007f78effbb730 in ?? ()
#11 0x00007f78d39d4680 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 75 (LWP 11473):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 74 (LWP 11472):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 11471):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 11470):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 11469):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 11468):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 11467):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 11466):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 11465):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 11464):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 11463):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 11462):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 11461):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 11460):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 11459):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 11458):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 11457):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 11456):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 11455):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b24000b902c in ?? ()
#4  0x00007f78f9bbc710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f78f9bbc730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x007f0400000026c8 in ?? ()
#9  0x00007f7923b52770 in ?? ()
#10 0x00007f78f9bbc730 in ?? ()
#11 0x0002008300000dfe in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 56 (LWP 11454):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 55 (LWP 11453):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 11452):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 11451):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 11450):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 11449):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 11448):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 11447):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 11446):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 11445):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 11444):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 11443):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 11442):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 11441):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 11440):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 11439):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 11438):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 11437):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 11436):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 11435):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240005ffec in ?? ()
#4  0x00007f7903fbe710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f7903fbe730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000000000045e4c9 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007f7923b52770 in ?? ()
#10 0x00007f7903fbe730 in ?? ()
#11 0x00007f791bf9fc58 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 36 (LWP 11434):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 35 (LWP 11433):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 34 (LWP 11432):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 33 (LWP 11431):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 32 (LWP 11430):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 31 (LWP 11429):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 30 (LWP 11428):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 11427):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 11426):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 11425):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 11424):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 11423):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 11422):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 11421):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 11420):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 11419):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 11418):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 19 (LWP 11417):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 18 (LWP 11416):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 17 (LWP 11415):
#0  0x00007f7923b52fb9 in ?? ()
#1  0x0000000017a335f0 in ?? ()
#2  0x0000000000000006 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4800003a00 in ?? ()
#5  0x00007f790e592700 in ?? ()
#6  0x000000000000000c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 16 (LWP 11414):
#0  0x00007f7923b52fb9 in ?? ()
#1  0x00007f790ed939a8 in ?? ()
#2  0x000000000000000d in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4400037198 in ?? ()
#5  0x00007f790ed93840 in ?? ()
#6  0x000000000000001a in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 15 (LWP 11413):
#0  0x00007f7923b52fb9 in ?? ()
#1  0x0000000000000018 in ?? ()
#2  0x0000000000000006 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b5800000118 in ?? ()
#5  0x00007f790f594410 in ?? ()
#6  0x000000000000000c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 11412):
#0  0x00007f7923b52ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 13 (LWP 11411):
#0  0x00007f791ef57a47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 11410):
#0  0x00007f791ef57a47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 11 (LWP 11409):
#0  0x00007f791ef57a47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 10 (LWP 11408):
#0  0x00007f791ef57a47 in ?? ()
#1  0x00007b5800010108 in ?? ()
#2  0x003ce00001950e9c in ?? ()
#3  0x00007f79145be500 in ?? ()
#4  0x00007f79145bfb80 in ?? ()
#5  0x00007f79145be500 in ?? ()
#6  0x000000000000000d in ?? ()
#7  0x00007b5800000f00 in ?? ()
#8  0x0000000000488695 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007f791c800000 in ?? ()
#10 0x0000000000488599 in __sanitizer::internal_alloc_placeholder ()
#11 0x00007f79145bfb80 in ?? ()
#12 0x00007f79219b0069 in ?? ()
#13 0x00007b4c00000000 in ?? ()
#14 0x00007f79271311a0 in ?? ()
#15 0x00007b4c00002c10 in ?? ()
#16 0x00007b4c00002c18 in ?? ()
#17 0x00007f79145be7a0 in ?? ()
#18 0x00007b4400036a00 in ?? ()
#19 0x00007f79145becd0 in ?? ()
#20 0x0000000000000000 in ?? ()

Thread 9 (LWP 11405):
#0  0x00007f791ef4acb9 in ?? ()
#1  0x00007f7917dbcc10 in ?? ()
#2  0x00007b0400009010 in ?? ()
#3  0x00007f7917dbdb80 in ?? ()
#4  0x00007f7917dbcc10 in ?? ()
#5  0x00007b0400009010 in ?? ()
#6  0x00000000004888a3 in __sanitizer::internal_alloc_placeholder ()
#7  0x00007f791c88e000 in ?? ()
#8  0x0100000000000001 in ?? ()
#9  0x00007f7917dbdb80 in ?? ()
#10 0x00007f792892fb28 in ?? ()
#11 0x0000000000000000 in ?? ()

Thread 8 (LWP 11404):
#0  0x00007f7923b52fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 7 (LWP 11403):
#0  0x00007f7923b569e2 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 6 (LWP 11395):
#0  0x00007f7923b52fb9 in ?? ()
#1  0x00007f7918dbea40 in ?? ()
#2  0x000000000000014f in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b44000361d8 in ?? ()
#5  0x00007f7918dbe5d0 in ?? ()
#6  0x000000000000029e in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 5 (LWP 11394):
#0  0x00007f7923b52fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 4 (LWP 11393):
#0  0x00007f7923b52fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 3 (LWP 11392):
#0  0x00007f7923b52fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 2 (LWP 11391):
#0  0x00007f791ef1a7a0 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 1 (LWP 11390):
#0  0x00007f7923b56d50 in ?? ()
#1  0x0000600001000078 in ?? ()
#2  0x0000000000467b2b in __sanitizer::internal_alloc_placeholder ()
#3  0x00007f791e178cc0 in ?? ()
#4  0x00007f791e178cc0 in ?? ()
#5  0x00007fff184b76b0 in ?? ()
#6  0x000000000048aef4 in __sanitizer::internal_alloc_placeholder ()
#7  0x0000600001000078 in ?? ()
#8  0x0000e00000a94dc5 in ?? ()
#9  0x00007f791e178cc0 in ?? ()
#10 0x00007f792207ef0b in ?? ()
#11 0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20250626 01:59:02.378561 10490 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 2 with UUID c1be94bb90e44876a3647fd124cf6adf and pid 11524
************************ BEGIN STACKS **************************
[New LWP 11525]
[New LWP 11526]
[New LWP 11527]
[New LWP 11528]
[New LWP 11529]
[New LWP 11536]
[New LWP 11537]
[New LWP 11538]
[New LWP 11541]
[New LWP 11542]
[New LWP 11543]
[New LWP 11544]
[New LWP 11545]
[New LWP 11546]
[New LWP 11547]
[New LWP 11548]
[New LWP 11549]
[New LWP 11550]
[New LWP 11551]
[New LWP 11552]
[New LWP 11553]
[New LWP 11554]
[New LWP 11555]
[New LWP 11556]
[New LWP 11557]
[New LWP 11558]
[New LWP 11559]
[New LWP 11560]
[New LWP 11561]
[New LWP 11562]
[New LWP 11563]
[New LWP 11564]
[New LWP 11565]
[New LWP 11566]
[New LWP 11567]
[New LWP 11568]
[New LWP 11569]
[New LWP 11570]
[New LWP 11571]
[New LWP 11572]
[New LWP 11573]
[New LWP 11574]
[New LWP 11575]
[New LWP 11576]
[New LWP 11577]
[New LWP 11578]
[New LWP 11579]
[New LWP 11580]
[New LWP 11581]
[New LWP 11582]
[New LWP 11583]
[New LWP 11584]
[New LWP 11585]
[New LWP 11586]
[New LWP 11587]
[New LWP 11588]
[New LWP 11589]
[New LWP 11590]
[New LWP 11591]
[New LWP 11592]
[New LWP 11593]
[New LWP 11594]
[New LWP 11595]
[New LWP 11596]
[New LWP 11597]
[New LWP 11598]
[New LWP 11599]
[New LWP 11600]
[New LWP 11601]
[New LWP 11602]
[New LWP 11603]
[New LWP 11604]
[New LWP 11605]
[New LWP 11606]
[New LWP 11607]
[New LWP 11608]
[New LWP 11609]
[New LWP 11610]
[New LWP 11611]
[New LWP 11612]
[New LWP 11613]
[New LWP 11614]
[New LWP 11615]
[New LWP 11616]
[New LWP 11617]
[New LWP 11618]
[New LWP 11619]
[New LWP 11620]
[New LWP 11621]
[New LWP 11622]
[New LWP 11623]
[New LWP 11624]
[New LWP 11625]
[New LWP 11626]
[New LWP 11627]
[New LWP 11628]
[New LWP 11629]
[New LWP 11630]
[New LWP 11631]
[New LWP 11632]
[New LWP 11633]
[New LWP 11634]
[New LWP 11635]
[New LWP 11636]
[New LWP 11637]
[New LWP 11638]
[New LWP 11639]
[New LWP 11640]
[New LWP 11641]
[New LWP 11642]
[New LWP 11643]
[New LWP 11644]
[New LWP 11645]
[New LWP 11646]
[New LWP 11647]
[New LWP 11648]
[New LWP 11649]
[New LWP 11650]
[New LWP 11651]
[New LWP 11652]
[New LWP 11653]
[New LWP 11654]
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c4802038e
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c4802038e
0x00007f4b9bb11d50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 11524 "kudu"  0x00007f4b9bb11d50 in ?? ()
  2    LWP 11525 "kudu"  0x00007f4b96ed57a0 in ?? ()
  3    LWP 11526 "kudu"  0x00007f4b9bb0dfb9 in ?? ()
  4    LWP 11527 "kudu"  0x00007f4b9bb0dfb9 in ?? ()
  5    LWP 11528 "kudu"  0x00007f4b9bb0dfb9 in ?? ()
  6    LWP 11529 "kernel-watcher-" 0x00007f4b9bb0dfb9 in ?? ()
  7    LWP 11536 "ntp client-1153" 0x00007f4b9bb119e2 in ?? ()
  8    LWP 11537 "file cache-evic" 0x00007f4b9bb0dfb9 in ?? ()
  9    LWP 11538 "sq_acceptor" 0x00007f4b96f05cb9 in ?? ()
  10   LWP 11541 "rpc reactor-115" 0x00007f4b96f12a47 in ?? ()
  11   LWP 11542 "rpc reactor-115" 0x00007f4b96f12a47 in ?? ()
  12   LWP 11543 "rpc reactor-115" 0x00007f4b96f12a47 in ?? ()
  13   LWP 11544 "rpc reactor-115" 0x00007f4b96f12a47 in ?? ()
  14   LWP 11545 "MaintenanceMgr " 0x00007f4b9bb0dad3 in ?? ()
  15   LWP 11546 "txn-status-mana" 0x00007f4b9bb0dfb9 in ?? ()
  16   LWP 11547 "collect_and_rem" 0x00007f4b9bb0dfb9 in ?? ()
  17   LWP 11548 "tc-session-exp-" 0x00007f4b9bb0dfb9 in ?? ()
  18   LWP 11549 "rpc worker-1154" 0x00007f4b9bb0dad3 in ?? ()
  19   LWP 11550 "rpc worker-1155" 0x00007f4b9bb0dad3 in ?? ()
  20   LWP 11551 "rpc worker-1155" 0x00007f4b9bb0dad3 in ?? ()
  21   LWP 11552 "rpc worker-1155" 0x00007f4b9bb0dad3 in ?? ()
  22   LWP 11553 "rpc worker-1155" 0x00007f4b9bb0dad3 in ?? ()
  23   LWP 11554 "rpc worker-1155" 0x00007f4b9bb0dad3 in ?? ()
  24   LWP 11555 "rpc worker-1155" 0x00007f4b9bb0dad3 in ?? ()
  25   LWP 11556 "rpc worker-1155" 0x00007f4b9bb0dad3 in ?? ()
  26   LWP 11557 "rpc worker-1155" 0x00007f4b9bb0dad3 in ?? ()
  27   LWP 11558 "rpc worker-1155" 0x00007f4b9bb0dad3 in ?? ()
  28   LWP 11559 "rpc worker-1155" 0x00007f4b9bb0dad3 in ?? ()
  29   LWP 11560 "rpc worker-1156" 0x00007f4b9bb0dad3 in ?? ()
  30   LWP 11561 "rpc worker-1156" 0x00007f4b9bb0dad3 in ?? ()
  31   LWP 11562 "rpc worker-1156" 0x00007f4b9bb0dad3 in ?? ()
  32   LWP 11563 "rpc worker-1156" 0x00007f4b9bb0dad3 in ?? ()
  33   LWP 11564 "rpc worker-1156" 0x00007f4b9bb0dad3 in ?? ()
  34   LWP 11565 "rpc worker-1156" 0x00007f4b9bb0dad3 in ?? ()
  35   LWP 11566 "rpc worker-1156" 0x00007f4b9bb0dad3 in ?? ()
  36   LWP 11567 "rpc worker-1156" 0x00007f4b9bb0dad3 in ?? ()
  37   LWP 11568 "rpc worker-1156" 0x00007f4b9bb0dad3 in ?? ()
  38   LWP 11569 "rpc worker-1156" 0x00007f4b9bb0dad3 in ?? ()
  39   LWP 11570 "rpc worker-1157" 0x00007f4b9bb0dad3 in ?? ()
  40   LWP 11571 "rpc worker-1157" 0x00007f4b9bb0dad3 in ?? ()
  41   LWP 11572 "rpc worker-1157" 0x00007f4b9bb0dad3 in ?? ()
  42   LWP 11573 "rpc worker-1157" 0x00007f4b9bb0dad3 in ?? ()
  43   LWP 11574 "rpc worker-1157" 0x00007f4b9bb0dad3 in ?? ()
  44   LWP 11575 "rpc worker-1157" 0x00007f4b9bb0dad3 in ?? ()
  45   LWP 11576 "rpc worker-1157" 0x00007f4b9bb0dad3 in ?? ()
  46   LWP 11577 "rpc worker-1157" 0x00007f4b9bb0dad3 in ?? ()
  47   LWP 11578 "rpc worker-1157" 0x00007f4b9bb0dad3 in ?? ()
  48   LWP 11579 "rpc worker-1157" 0x00007f4b9bb0dad3 in ?? ()
  49   LWP 11580 "rpc worker-1158" 0x00007f4b9bb0dad3 in ?? ()
  50   LWP 11581 "rpc worker-1158" 0x00007f4b9bb0dad3 in ?? ()
  51   LWP 11582 "rpc worker-1158" 0x00007f4b9bb0dad3 in ?? ()
  52   LWP 11583 "rpc worker-1158" 0x00007f4b9bb0dad3 in ?? ()
  53   LWP 11584 "rpc worker-1158" 0x00007f4b9bb0dad3 in ?? ()
  54   LWP 11585 "rpc worker-1158" 0x00007f4b9bb0dad3 in ?? ()
  55   LWP 11586 "rpc worker-1158" 0x00007f4b9bb0dad3 in ?? ()
  56   LWP 11587 "rpc worker-1158" 0x00007f4b9bb0dad3 in ?? ()
  57   LWP 11588 "rpc worker-1158" 0x00007f4b9bb0dad3 in ?? ()
  58   LWP 11589 "rpc worker-1158" 0x00007f4b9bb0dad3 in ?? ()
  59   LWP 11590 "rpc worker-1159" 0x00007f4b9bb0dad3 in ?? ()
  60   LWP 11591 "rpc worker-1159" 0x00007f4b9bb0dad3 in ?? ()
  61   LWP 11592 "rpc worker-1159" 0x00007f4b9bb0dad3 in ?? ()
  62   LWP 11593 "rpc worker-1159" 0x00007f4b9bb0dad3 in ?? ()
  63   LWP 11594 "rpc worker-1159" 0x00007f4b9bb0dad3 in ?? ()
  64   LWP 11595 "rpc worker-1159" 0x00007f4b9bb0dad3 in ?? ()
  65   LWP 11596 "rpc worker-1159" 0x00007f4b9bb0dad3 in ?? ()
  66   LWP 11597 "rpc worker-1159" 0x00007f4b9bb0dad3 in ?? ()
  67   LWP 11598 "rpc worker-1159" 0x00007f4b9bb0dad3 in ?? ()
  68   LWP 11599 "rpc worker-1159" 0x00007f4b9bb0dad3 in ?? ()
  69   LWP 11600 "rpc worker-1160" 0x00007f4b9bb0dad3 in ?? ()
  70   LWP 11601 "rpc worker-1160" 0x00007f4b9bb0dad3 in ?? ()
  71   LWP 11602 "rpc worker-1160" 0x00007f4b9bb0dad3 in ?? ()
  72   LWP 11603 "rpc worker-1160" 0x00007f4b9bb0dad3 in ?? ()
  73   LWP 11604 "rpc worker-1160" 0x00007f4b9bb0dad3 in ?? ()
  74   LWP 11605 "rpc worker-1160" 0x00007f4b9bb0dad3 in ?? ()
  75   LWP 11606 "rpc worker-1160" 0x00007f4b9bb0dad3 in ?? ()
  76   LWP 11607 "rpc worker-1160" 0x00007f4b9bb0dad3 in ?? ()
  77   LWP 11608 "rpc worker-1160" 0x00007f4b9bb0dad3 in ?? ()
  78   LWP 11609 "rpc worker-1160" 0x00007f4b9bb0dad3 in ?? ()
  79   LWP 11610 "rpc worker-1161" 0x00007f4b9bb0dad3 in ?? ()
  80   LWP 11611 "rpc worker-1161" 0x00007f4b9bb0dad3 in ?? ()
  81   LWP 11612 "rpc worker-1161" 0x00007f4b9bb0dad3 in ?? ()
  82   LWP 11613 "rpc worker-1161" 0x00007f4b9bb0dad3 in ?? ()
  83   LWP 11614 "rpc worker-1161" 0x00007f4b9bb0dad3 in ?? ()
  84   LWP 11615 "rpc worker-1161" 0x00007f4b9bb0dad3 in ?? ()
  85   LWP 11616 "rpc worker-1161" 0x00007f4b9bb0dad3 in ?? ()
  86   LWP 11617 "rpc worker-1161" 0x00007f4b9bb0dad3 in ?? ()
  87   LWP 11618 "rpc worker-1161" 0x00007f4b9bb0dad3 in ?? ()
  88   LWP 11619 "rpc worker-1161" 0x00007f4b9bb0dad3 in ?? ()
  89   LWP 11620 "rpc worker-1162" 0x00007f4b9bb0dad3 in ?? ()
  90   LWP 11621 "rpc worker-1162" 0x00007f4b9bb0dad3 in ?? ()
  91   LWP 11622 "rpc worker-1162" 0x00007f4b9bb0dad3 in ?? ()
  92   LWP 11623 "rpc worker-1162" 0x00007f4b9bb0dad3 in ?? ()
  93   LWP 11624 "rpc worker-1162" 0x00007f4b9bb0dad3 in ?? ()
  94   LWP 11625 "rpc worker-1162" 0x00007f4b9bb0dad3 in ?? ()
  95   LWP 11626 "rpc worker-1162" 0x00007f4b9bb0dad3 in ?? ()
  96   LWP 11627 "rpc worker-1162" 0x00007f4b9bb0dad3 in ?? ()
  97   LWP 11628 "rpc worker-1162" 0x00007f4b9bb0dad3 in ?? ()
  98   LWP 11629 "rpc worker-1162" 0x00007f4b9bb0dad3 in ?? ()
  99   LWP 11630 "rpc worker-1163" 0x00007f4b9bb0dad3 in ?? ()
  100  LWP 11631 "rpc worker-1163" 0x00007f4b9bb0dad3 in ?? ()
  101  LWP 11632 "rpc worker-1163" 0x00007f4b9bb0dad3 in ?? ()
  102  LWP 11633 "rpc worker-1163" 0x00007f4b9bb0dad3 in ?? ()
  103  LWP 11634 "rpc worker-1163" 0x00007f4b9bb0dad3 in ?? ()
  104  LWP 11635 "rpc worker-1163" 0x00007f4b9bb0dad3 in ?? ()
  105  LWP 11636 "rpc worker-1163" 0x00007f4b9bb0dad3 in ?? ()
  106  LWP 11637 "rpc worker-1163" 0x00007f4b9bb0dad3 in ?? ()
  107  LWP 11638 "rpc worker-1163" 0x00007f4b9bb0dad3 in ?? ()
  108  LWP 11639 "rpc worker-1163" 0x00007f4b9bb0dad3 in ?? ()
  109  LWP 11640 "rpc worker-1164" 0x00007f4b9bb0dad3 in ?? ()
  110  LWP 11641 "rpc worker-1164" 0x00007f4b9bb0dad3 in ?? ()
  111  LWP 11642 "rpc worker-1164" 0x00007f4b9bb0dad3 in ?? ()
  112  LWP 11643 "rpc worker-1164" 0x00007f4b9bb0dad3 in ?? ()
  113  LWP 11644 "rpc worker-1164" 0x00007f4b9bb0dad3 in ?? ()
  114  LWP 11645 "rpc worker-1164" 0x00007f4b9bb0dad3 in ?? ()
  115  LWP 11646 "rpc worker-1164" 0x00007f4b9bb0dad3 in ?? ()
  116  LWP 11647 "rpc worker-1164" 0x00007f4b9bb0dad3 in ?? ()
  117  LWP 11648 "rpc worker-1164" 0x00007f4b9bb0dad3 in ?? ()
  118  LWP 11649 "diag-logger-116" 0x00007f4b9bb0dfb9 in ?? ()
  119  LWP 11650 "result-tracker-" 0x00007f4b9bb0dfb9 in ?? ()
  120  LWP 11651 "excess-log-dele" 0x00007f4b9bb0dfb9 in ?? ()
  121  LWP 11652 "acceptor-11652" 0x00007f4b96f140c7 in ?? ()
  122  LWP 11653 "heartbeat-11653" 0x00007f4b9bb0dfb9 in ?? ()
  123  LWP 11654 "maintenance_sch" 0x00007f4b9bb0dfb9 in ?? ()

Thread 123 (LWP 11654):
#0  0x00007f4b9bb0dfb9 in ?? ()
#1  0x00007b0100000000 in ?? ()
#2  0x0000000000000101 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b54000028f0 in ?? ()
#5  0x00007f4b4fdb96c0 in ?? ()
#6  0x0000000000000202 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 11653):
#0  0x00007f4b9bb0dfb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 121 (LWP 11652):
#0  0x00007f4b96f140c7 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 120 (LWP 11651):
#0  0x00007f4b9bb0dfb9 in ?? ()
#1  0x00007f4b515bc940 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffe91554e90 in ?? ()
#5  0x00007f4b515bc7b0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 11650):
#0  0x00007f4b9bb0dfb9 in ?? ()
#1  0x0000000085352fb8 in ?? ()
#2  0x0000000000000040 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b3400001008 in ?? ()
#5  0x00007f4b51dbd800 in ?? ()
#6  0x0000000000000080 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 118 (LWP 11649):
#0  0x00007f4b9bb0dfb9 in ?? ()
#1  0x00007f4b94e88008 in ?? ()
#2  0x000000000000003c in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4000000c90 in ?? ()
#5  0x00007f4b525be750 in ?? ()
#6  0x0000000000000078 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 11648):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 116 (LWP 11647):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 115 (LWP 11646):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 114 (LWP 11645):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 11644):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 11643):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 11642):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 11641):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 11640):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 11639):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 11638):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 11637):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 11636):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 11635):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 11634):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 11633):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 11632):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 11631):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 11630):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 11629):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 97 (LWP 11628):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 11627):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 11626):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 11625):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 11624):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 11623):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 11622):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 11621):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 11620):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 11619):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 11618):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 11617):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 11616):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 11615):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 11614):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 11613):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 11612):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 11611):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 11610):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 11609):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 11608):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x000000000000085c in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00007b24001147c8 in ?? ()
#4  0x00007f4b677ba710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f4b677ba730 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 76 (LWP 11607):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x00000000000005d1 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240010ffcc in ?? ()
#4  0x00007f4b67fbb710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f4b67fbb730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x0000000000000000 in ?? ()

Thread 75 (LWP 11606):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x00000000000000fb in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240010d7dc in ?? ()
#4  0x00007f4b687bc710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f4b687bc730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000000000045e4c9 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007f4b9bb0d770 in ?? ()
#10 0x00007f4b687bc730 in ?? ()
#11 0x00007f4b4cb05e78 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 74 (LWP 11605):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 11604):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 11603):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 11602):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 11601):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 11600):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 11599):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 11598):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 11597):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 11596):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 11595):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 11594):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 11593):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 11592):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 11591):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 11590):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 11589):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 11588):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b24000b902c in ?? ()
#4  0x00007f4b71bbc710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f4b71bbc730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x0000000000000000 in ?? ()

Thread 56 (LWP 11587):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 55 (LWP 11586):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 11585):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 11584):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 11583):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 11582):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 11581):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 11580):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 11579):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 11578):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 11577):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 11576):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 11575):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 11574):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 11573):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 11572):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 11571):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 11570):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 11569):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 11568):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240005ffec in ?? ()
#4  0x00007f4b7bfbe710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f4b7bfbe730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000000000045e4c9 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007f4b9bb0d770 in ?? ()
#10 0x00007f4b7bfbe730 in ?? ()
#11 0x00007f4b93f69c50 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 36 (LWP 11567):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 35 (LWP 11566):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 34 (LWP 11565):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 33 (LWP 11564):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 32 (LWP 11563):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 31 (LWP 11562):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 30 (LWP 11561):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 11560):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 11559):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 11558):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 11557):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 11556):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 11555):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 11554):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 11553):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 11552):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 11551):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 19 (LWP 11550):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 18 (LWP 11549):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 17 (LWP 11548):
#0  0x00007f4b9bb0dfb9 in ?? ()
#1  0x0000000017a335f0 in ?? ()
#2  0x0000000000000006 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4800003a00 in ?? ()
#5  0x00007f4b86592700 in ?? ()
#6  0x000000000000000c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 16 (LWP 11547):
#0  0x00007f4b9bb0dfb9 in ?? ()
#1  0x00007f4b86d939a8 in ?? ()
#2  0x000000000000000c in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4400037198 in ?? ()
#5  0x00007f4b86d93840 in ?? ()
#6  0x0000000000000018 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 15 (LWP 11546):
#0  0x00007f4b9bb0dfb9 in ?? ()
#1  0x0000000000000018 in ?? ()
#2  0x0000000000000006 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b5800000118 in ?? ()
#5  0x00007f4b87594410 in ?? ()
#6  0x000000000000000c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 11545):
#0  0x00007f4b9bb0dad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 13 (LWP 11544):
#0  0x00007f4b96f12a47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 11543):
#0  0x00007f4b96f12a47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 11 (LWP 11542):
#0  0x00007f4b96f12a47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 10 (LWP 11541):
#0  0x00007f4b96f12a47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 9 (LWP 11538):
#0  0x00007f4b96f05cb9 in ?? ()
#1  0x00007f4b8fdbcc10 in ?? ()
#2  0x00007b040000a860 in ?? ()
#3  0x00007f4b8fdbdb80 in ?? ()
#4  0x00007f4b8fdbcc10 in ?? ()
#5  0x00007b040000a860 in ?? ()
#6  0x00000000004888a3 in __sanitizer::internal_alloc_placeholder ()
#7  0x00007f4b9483a000 in ?? ()
#8  0x0100000000000001 in ?? ()
#9  0x00007f4b8fdbdb80 in ?? ()
#10 0x00007f4ba08eab28 in ?? ()
#11 0x0000000000000000 in ?? ()

Thread 8 (LWP 11537):
#0  0x00007f4b9bb0dfb9 in ?? ()
#1  0x0000600000000000 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4400034018 in ?? ()
#5  0x00007f4b8f5bb7f0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 7 (LWP 11536):
#0  0x00007f4b9bb119e2 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 6 (LWP 11529):
#0  0x00007f4b9bb0dfb9 in ?? ()
#1  0x00007f4b90dbea40 in ?? ()
#2  0x0000000000000147 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b44000361d8 in ?? ()
#5  0x00007f4b90dbe5d0 in ?? ()
#6  0x000000000000028e in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 5 (LWP 11528):
#0  0x00007f4b9bb0dfb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 4 (LWP 11527):
#0  0x00007f4b9bb0dfb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 3 (LWP 11526):
#0  0x00007f4b9bb0dfb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 2 (LWP 11525):
#0  0x00007f4b96ed57a0 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 1 (LWP 11524):
#0  0x00007f4b9bb11d50 in ?? ()
#1  0x0000600001000078 in ?? ()
#2  0x0000000000467b2b in __sanitizer::internal_alloc_placeholder ()
#3  0x00007f4b96133cc0 in ?? ()
#4  0x00007f4b96133cc0 in ?? ()
#5  0x00007ffe91554ca0 in ?? ()
#6  0x000000000048aef4 in __sanitizer::internal_alloc_placeholder ()
#7  0x0000600001000078 in ?? ()
#8  0x0000e00000a9adcd in ?? ()
#9  0x00007f4b96133cc0 in ?? ()
#10 0x00007f4b9a039f0b in ?? ()
#11 0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20250626 01:59:03.511559 10490 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu with pid 11257
I20250626 01:59:03.570781 10490 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu with pid 11390
I20250626 01:59:03.631290 10490 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu with pid 11524
I20250626 01:59:03.693063 10490 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskjTgcG5/build/tsan/bin/kudu with pid 11165
2025-06-26T01:59:03Z chronyd exiting
I20250626 01:59:03.763218 10490 test_util.cc:183] -----------------------------------------------
I20250626 01:59:03.763502 10490 test_util.cc:184] Had failures, leaving test files at /tmp/dist-test-taskjTgcG5/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750903044745843-10490-0
[  FAILED  ] TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate (74005 ms)
[----------] 4 tests from TabletCopyITest (98830 ms total)

[----------] 1 test from FaultFlags/BadTabletCopyITest
[ RUN      ] FaultFlags/BadTabletCopyITest.TestBadCopy/1
/home/jenkins-slave/workspace/build_and_test_flaky@2/src/kudu/integration-tests/tablet_copy-itest.cc:1510: Skipped
test is skipped; set KUDU_ALLOW_SLOW_TESTS=1 to run
[  SKIPPED ] FaultFlags/BadTabletCopyITest.TestBadCopy/1 (9 ms)
[----------] 1 test from FaultFlags/BadTabletCopyITest (9 ms total)

[----------] Global test environment tear-down
[==========] 5 tests from 2 test suites ran. (98840 ms total)
[  PASSED  ] 1 test.
[  SKIPPED ] 3 tests, listed below:
[  SKIPPED ] TabletCopyITest.TestRejectRogueLeader
[  SKIPPED ] TabletCopyITest.TestDeleteLeaderDuringTabletCopyStressTest
[  SKIPPED ] FaultFlags/BadTabletCopyITest.TestBadCopy/1
[  FAILED  ] 1 test, listed below:
[  FAILED  ] TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate

 1 FAILED TEST