Diagnosed failure

TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate: /home/jenkins-slave/workspace/build_and_test_flaky@2/src/kudu/integration-tests/tablet_copy-itest.cc:2151: Failure
Failed
Bad status: Timed out: Timed out waiting for number of WAL segments on tablet 4c32bc3832404ac68aa4aac69ef1184d on TS 0 to be 6. Found 5
I20250624 19:59:10.674275  3133 external_mini_cluster-itest-base.cc:80] Found fatal failure
I20250624 19:59:10.674808  3133 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 0 with UUID 9bd87754848d4ac7a22df1de055c1cef and pid 3899
************************ BEGIN STACKS **************************
W20250624 19:59:11.495973  4024 debug-util.cc:398] Leaking SignalData structure 0x7b08000c2b20 after lost signal to thread 3900
W20250624 19:59:11.497035  4024 debug-util.cc:398] Leaking SignalData structure 0x7b08000b2b00 after lost signal to thread 4027
[New LWP 3900]
[New LWP 3901]
[New LWP 3902]
[New LWP 3903]
[New LWP 3904]
[New LWP 3911]
[New LWP 3912]
[New LWP 3913]
[New LWP 3916]
[New LWP 3917]
[New LWP 3918]
[New LWP 3919]
[New LWP 3920]
[New LWP 3921]
[New LWP 3922]
[New LWP 3923]
[New LWP 3924]
[New LWP 3925]
[New LWP 3926]
[New LWP 3927]
[New LWP 3928]
[New LWP 3929]
[New LWP 3930]
[New LWP 3931]
[New LWP 3932]
[New LWP 3933]
[New LWP 3934]
[New LWP 3935]
[New LWP 3936]
[New LWP 3937]
[New LWP 3938]
[New LWP 3939]
[New LWP 3940]
[New LWP 3941]
[New LWP 3942]
[New LWP 3943]
[New LWP 3944]
[New LWP 3945]
[New LWP 3946]
[New LWP 3947]
[New LWP 3948]
[New LWP 3949]
[New LWP 3950]
[New LWP 3951]
[New LWP 3952]
[New LWP 3953]
[New LWP 3954]
[New LWP 3955]
[New LWP 3956]
[New LWP 3957]
[New LWP 3958]
[New LWP 3959]
[New LWP 3960]
[New LWP 3961]
[New LWP 3962]
[New LWP 3963]
[New LWP 3964]
[New LWP 3965]
[New LWP 3966]
[New LWP 3967]
[New LWP 3968]
[New LWP 3969]
[New LWP 3970]
[New LWP 3971]
[New LWP 3972]
[New LWP 3973]
[New LWP 3974]
[New LWP 3975]
[New LWP 3976]
[New LWP 3977]
[New LWP 3978]
[New LWP 3979]
[New LWP 3980]
[New LWP 3981]
[New LWP 3982]
[New LWP 3983]
[New LWP 3984]
[New LWP 3985]
[New LWP 3986]
[New LWP 3987]
[New LWP 3988]
[New LWP 3989]
[New LWP 3990]
[New LWP 3991]
[New LWP 3992]
[New LWP 3993]
[New LWP 3994]
[New LWP 3995]
[New LWP 3996]
[New LWP 3997]
[New LWP 3998]
[New LWP 3999]
[New LWP 4000]
[New LWP 4001]
[New LWP 4002]
[New LWP 4003]
[New LWP 4004]
[New LWP 4005]
[New LWP 4006]
[New LWP 4007]
[New LWP 4008]
[New LWP 4009]
[New LWP 4010]
[New LWP 4011]
[New LWP 4012]
[New LWP 4013]
[New LWP 4014]
[New LWP 4015]
[New LWP 4016]
[New LWP 4017]
[New LWP 4018]
[New LWP 4019]
[New LWP 4020]
[New LWP 4021]
[New LWP 4022]
[New LWP 4023]
[New LWP 4024]
[New LWP 4025]
[New LWP 4026]
[New LWP 4027]
[New LWP 4028]
[New LWP 4029]
[New LWP 4337]
[New LWP 4519]
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c4802038e
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c4802038e
0x00007fee48c8fd50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 3899 "kudu"   0x00007fee48c8fd50 in ?? ()
  2    LWP 3900 "kudu"   0x00007fee440537a0 in ?? ()
  3    LWP 3901 "kudu"   0x00007fee48c8bfb9 in ?? ()
  4    LWP 3902 "kudu"   0x00007fee48c8bfb9 in ?? ()
  5    LWP 3903 "kudu"   0x00007fee48c8bfb9 in ?? ()
  6    LWP 3904 "kernel-watcher-" 0x00007fee48c8bfb9 in ?? ()
  7    LWP 3911 "ntp client-3911" 0x00007fee48c8f9e2 in ?? ()
  8    LWP 3912 "file cache-evic" 0x00007fee48c8bfb9 in ?? ()
  9    LWP 3913 "sq_acceptor" 0x00007fee44083cb9 in ?? ()
  10   LWP 3916 "rpc reactor-391" 0x00007fee44090a47 in ?? ()
  11   LWP 3917 "rpc reactor-391" 0x00007fee44090a47 in ?? ()
  12   LWP 3918 "rpc reactor-391" 0x00007fee44090a47 in ?? ()
  13   LWP 3919 "rpc reactor-391" 0x00007fee44090a47 in ?? ()
  14   LWP 3920 "MaintenanceMgr " 0x00007fee48c8bad3 in ?? ()
  15   LWP 3921 "txn-status-mana" 0x00007fee48c8bfb9 in ?? ()
  16   LWP 3922 "collect_and_rem" 0x00007fee48c8bfb9 in ?? ()
  17   LWP 3923 "tc-session-exp-" 0x00007fee48c8bfb9 in ?? ()
  18   LWP 3924 "rpc worker-3924" 0x00007fee48c8bad3 in ?? ()
  19   LWP 3925 "rpc worker-3925" 0x00007fee48c8bad3 in ?? ()
  20   LWP 3926 "rpc worker-3926" 0x00007fee48c8bad3 in ?? ()
  21   LWP 3927 "rpc worker-3927" 0x00007fee48c8bad3 in ?? ()
  22   LWP 3928 "rpc worker-3928" 0x00007fee48c8bad3 in ?? ()
  23   LWP 3929 "rpc worker-3929" 0x00007fee48c8bad3 in ?? ()
  24   LWP 3930 "rpc worker-3930" 0x00007fee48c8bad3 in ?? ()
  25   LWP 3931 "rpc worker-3931" 0x00007fee48c8bad3 in ?? ()
  26   LWP 3932 "rpc worker-3932" 0x00007fee48c8bad3 in ?? ()
  27   LWP 3933 "rpc worker-3933" 0x00007fee48c8bad3 in ?? ()
  28   LWP 3934 "rpc worker-3934" 0x00007fee48c8bad3 in ?? ()
  29   LWP 3935 "rpc worker-3935" 0x00007fee48c8bad3 in ?? ()
  30   LWP 3936 "rpc worker-3936" 0x00007fee48c8bad3 in ?? ()
  31   LWP 3937 "rpc worker-3937" 0x00007fee48c8bad3 in ?? ()
  32   LWP 3938 "rpc worker-3938" 0x00007fee48c8bad3 in ?? ()
  33   LWP 3939 "rpc worker-3939" 0x00007fee48c8bad3 in ?? ()
  34   LWP 3940 "rpc worker-3940" 0x00007fee48c8bad3 in ?? ()
  35   LWP 3941 "rpc worker-3941" 0x00007fee48c8bad3 in ?? ()
  36   LWP 3942 "rpc worker-3942" 0x00007fee48c8bad3 in ?? ()
  37   LWP 3943 "rpc worker-3943" 0x00007fee48c8bad3 in ?? ()
  38   LWP 3944 "rpc worker-3944" 0x00007fee48c8bad3 in ?? ()
  39   LWP 3945 "rpc worker-3945" 0x00007fee48c8bad3 in ?? ()
  40   LWP 3946 "rpc worker-3946" 0x00007fee48c8bad3 in ?? ()
  41   LWP 3947 "rpc worker-3947" 0x00007fee48c8bad3 in ?? ()
  42   LWP 3948 "rpc worker-3948" 0x00007fee48c8bad3 in ?? ()
  43   LWP 3949 "rpc worker-3949" 0x00007fee48c8bad3 in ?? ()
  44   LWP 3950 "rpc worker-3950" 0x00007fee48c8bad3 in ?? ()
  45   LWP 3951 "rpc worker-3951" 0x00007fee48c8bad3 in ?? ()
  46   LWP 3952 "rpc worker-3952" 0x00007fee48c8bad3 in ?? ()
  47   LWP 3953 "rpc worker-3953" 0x00007fee48c8bad3 in ?? ()
  48   LWP 3954 "rpc worker-3954" 0x00007fee48c8bad3 in ?? ()
  49   LWP 3955 "rpc worker-3955" 0x00007fee48c8bad3 in ?? ()
  50   LWP 3956 "rpc worker-3956" 0x00007fee48c8bad3 in ?? ()
  51   LWP 3957 "rpc worker-3957" 0x00007fee48c8bad3 in ?? ()
  52   LWP 3958 "rpc worker-3958" 0x00007fee48c8bad3 in ?? ()
  53   LWP 3959 "rpc worker-3959" 0x00007fee48c8bad3 in ?? ()
  54   LWP 3960 "rpc worker-3960" 0x00007fee48c8bad3 in ?? ()
  55   LWP 3961 "rpc worker-3961" 0x00007fee48c8bad3 in ?? ()
  56   LWP 3962 "rpc worker-3962" 0x00007fee48c8bad3 in ?? ()
  57   LWP 3963 "rpc worker-3963" 0x00007fee48c8bad3 in ?? ()
  58   LWP 3964 "rpc worker-3964" 0x00007fee48c8bad3 in ?? ()
  59   LWP 3965 "rpc worker-3965" 0x00007fee48c8bad3 in ?? ()
  60   LWP 3966 "rpc worker-3966" 0x00007fee48c8bad3 in ?? ()
  61   LWP 3967 "rpc worker-3967" 0x00007fee48c8bad3 in ?? ()
  62   LWP 3968 "rpc worker-3968" 0x00007fee48c8bad3 in ?? ()
  63   LWP 3969 "rpc worker-3969" 0x00007fee48c8bad3 in ?? ()
  64   LWP 3970 "rpc worker-3970" 0x00007fee48c8bad3 in ?? ()
  65   LWP 3971 "rpc worker-3971" 0x00007fee48c8bad3 in ?? ()
  66   LWP 3972 "rpc worker-3972" 0x00007fee48c8bad3 in ?? ()
  67   LWP 3973 "rpc worker-3973" 0x00007fee48c8bad3 in ?? ()
  68   LWP 3974 "rpc worker-3974" 0x00007fee48c8bad3 in ?? ()
  69   LWP 3975 "rpc worker-3975" 0x00007fee48c8bad3 in ?? ()
  70   LWP 3976 "rpc worker-3976" 0x00007fee48c8bad3 in ?? ()
  71   LWP 3977 "rpc worker-3977" 0x00007fee48c8bad3 in ?? ()
  72   LWP 3978 "rpc worker-3978" 0x00007fee48c8bad3 in ?? ()
  73   LWP 3979 "rpc worker-3979" 0x00007fee48c8bad3 in ?? ()
  74   LWP 3980 "rpc worker-3980" 0x00007fee48c8bad3 in ?? ()
  75   LWP 3981 "rpc worker-3981" 0x00007fee48c8bad3 in ?? ()
  76   LWP 3982 "rpc worker-3982" 0x00007fee48c8bad3 in ?? ()
  77   LWP 3983 "rpc worker-3983" 0x00007fee48c8bad3 in ?? ()
  78   LWP 3984 "rpc worker-3984" 0x00007fee48c8bad3 in ?? ()
  79   LWP 3985 "rpc worker-3985" 0x00007fee48c8bad3 in ?? ()
  80   LWP 3986 "rpc worker-3986" 0x00007fee48c8bad3 in ?? ()
  81   LWP 3987 "rpc worker-3987" 0x00007fee48c8bad3 in ?? ()
  82   LWP 3988 "rpc worker-3988" 0x00007fee48c8bad3 in ?? ()
  83   LWP 3989 "rpc worker-3989" 0x00007fee48c8bad3 in ?? ()
  84   LWP 3990 "rpc worker-3990" 0x00007fee48c8bad3 in ?? ()
  85   LWP 3991 "rpc worker-3991" 0x00007fee48c8bad3 in ?? ()
  86   LWP 3992 "rpc worker-3992" 0x00007fee48c8bad3 in ?? ()
  87   LWP 3993 "rpc worker-3993" 0x00007fee48c8bad3 in ?? ()
  88   LWP 3994 "rpc worker-3994" 0x00007fee48c8bad3 in ?? ()
  89   LWP 3995 "rpc worker-3995" 0x00007fee48c8bad3 in ?? ()
  90   LWP 3996 "rpc worker-3996" 0x00007fee48c8bad3 in ?? ()
  91   LWP 3997 "rpc worker-3997" 0x00007fee48c8bad3 in ?? ()
  92   LWP 3998 "rpc worker-3998" 0x00007fee48c8bad3 in ?? ()
  93   LWP 3999 "rpc worker-3999" 0x00007fee48c8bad3 in ?? ()
  94   LWP 4000 "rpc worker-4000" 0x00007fee48c8bad3 in ?? ()
  95   LWP 4001 "rpc worker-4001" 0x00007fee48c8bad3 in ?? ()
  96   LWP 4002 "rpc worker-4002" 0x00007fee48c8bad3 in ?? ()
  97   LWP 4003 "rpc worker-4003" 0x00007fee48c8bad3 in ?? ()
  98   LWP 4004 "rpc worker-4004" 0x00007fee48c8bad3 in ?? ()
  99   LWP 4005 "rpc worker-4005" 0x00007fee48c8bad3 in ?? ()
  100  LWP 4006 "rpc worker-4006" 0x00007fee48c8bad3 in ?? ()
  101  LWP 4007 "rpc worker-4007" 0x00007fee48c8bad3 in ?? ()
  102  LWP 4008 "rpc worker-4008" 0x00007fee48c8bad3 in ?? ()
  103  LWP 4009 "rpc worker-4009" 0x00007fee48c8bad3 in ?? ()
  104  LWP 4010 "rpc worker-4010" 0x00007fee48c8bad3 in ?? ()
  105  LWP 4011 "rpc worker-4011" 0x00007fee48c8bad3 in ?? ()
  106  LWP 4012 "rpc worker-4012" 0x00007fee48c8bad3 in ?? ()
  107  LWP 4013 "rpc worker-4013" 0x00007fee48c8bad3 in ?? ()
  108  LWP 4014 "rpc worker-4014" 0x00007fee48c8bad3 in ?? ()
  109  LWP 4015 "rpc worker-4015" 0x00007fee48c8bad3 in ?? ()
  110  LWP 4016 "rpc worker-4016" 0x00007fee48c8bad3 in ?? ()
  111  LWP 4017 "rpc worker-4017" 0x00007fee48c8bad3 in ?? ()
  112  LWP 4018 "rpc worker-4018" 0x00007fee48c8bad3 in ?? ()
  113  LWP 4019 "rpc worker-4019" 0x00007fee48c8bad3 in ?? ()
  114  LWP 4020 "rpc worker-4020" 0x00007fee48c8bad3 in ?? ()
  115  LWP 4021 "rpc worker-4021" 0x00007fee48c8bad3 in ?? ()
  116  LWP 4022 "rpc worker-4022" 0x00007fee48c8bad3 in ?? ()
  117  LWP 4023 "rpc worker-4023" 0x00007fee48c8bad3 in ?? ()
  118  LWP 4024 "diag-logger-402" 0x00007fee48c9008f in ?? ()
  119  LWP 4025 "result-tracker-" 0x00007fee48c8bfb9 in ?? ()
  120  LWP 4026 "excess-log-dele" 0x00007fee48c8bfb9 in ?? ()
  121  LWP 4027 "acceptor-4027" 0x00007fee440920c7 in ?? ()
  122  LWP 4028 "heartbeat-4028" 0x00007fee48c8bfb9 in ?? ()
  123  LWP 4029 "maintenance_sch" 0x00007fee48c8bfb9 in ?? ()
  124  LWP 4337 "wal-append [wor" 0x00007fee48c8bfb9 in ?? ()
  125  LWP 4519 "raft [worker]-4" 0x00007fee48c8bfb9 in ?? ()

Thread 125 (LWP 4519):
#0  0x00007fee48c8bfb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 124 (LWP 4337):
#0  0x00007fee48c8bfb9 in ?? ()
#1  0x00007b10000563f0 in ?? ()
#2  0x00000000000010e9 in ?? ()
#3  0x0000000100000081 in ?? ()
#4  0x00007b640006001c in ?? ()
#5  0x00007fedfa4bd440 in ?? ()
#6  0x00000000000021d3 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00007fedfa4bd460 in ?? ()
#9  0x0000000200000000 in ?? ()
#10 0x0000008000000189 in ?? ()
#11 0x00007fee4210e008 in ?? ()
#12 0x00007fed00000001 in ?? ()
#13 0x0000000000000000 in ?? ()

Thread 123 (LWP 4029):
#0  0x00007fee48c8bfb9 in ?? ()
#1  0x00007f0100000000 in ?? ()
#2  0x0000000000000103 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b54000028f0 in ?? ()
#5  0x00007fedfceb96c0 in ?? ()
#6  0x0000000000000206 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 4028):
#0  0x00007fee48c8bfb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 121 (LWP 4027):
#0  0x00007fee440920c7 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 120 (LWP 4026):
#0  0x00007fee48c8bfb9 in ?? ()
#1  0x00007fedfe6bc940 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffebb936b00 in ?? ()
#5  0x00007fedfe6bc7b0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 4025):
#0  0x00007fee48c8bfb9 in ?? ()
#1  0x0000000085352f88 in ?? ()
#2  0x0000000000000041 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b3400001008 in ?? ()
#5  0x00007fedfeebd800 in ?? ()
#6  0x0000000000000082 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 118 (LWP 4024):
#0  0x00007fee48c9008f in ?? ()
#1  0x000000000000000a in ?? ()
#2  0x01ece0000f5c9e2a in ?? ()
#3  0x00007fedff6bdb30 in ?? ()
#4  0x00007fedff6bfb80 in ?? ()
#5  0x00007fedff6bdb30 in ?? ()
#6  0x0000000000000029 in ?? ()
#7  0x00007fedff6bdfc8 in ?? ()
#8  0x000000000046ddd7 in __sanitizer::internal_alloc_placeholder ()
#9  0x000000000046dd49 in __sanitizer::internal_alloc_placeholder ()
#10 0x00007fedff6bdeb9 in ?? ()
#11 0x00007fedff6bfb80 in ?? ()
#12 0x00007fee453fb4b3 in ?? ()
#13 0x00007fedff6b0000 in ?? ()
#14 0x00007fee453fb2b2 in ?? ()
#15 0x00007fedff6b0000 in ?? ()
#16 0x00007fedff6bde4b in ?? ()
#17 0x00007fedff6be0c0 in ?? ()
#18 0x0000000000000025 in ?? ()
#19 0x00007fedff6bde4a in ?? ()
#20 0x00000fb92c330000 in ?? ()
#21 0x0000000000000040 in ?? ()
#22 0x00007fee453fb4b3 in ?? ()
#23 0x3062306332396266 in ?? ()
#24 0x323962662d303030 in ?? ()
#25 0x2030303030333163 in ?? ()
#26 0x30303020702d2d72 in ?? ()
#27 0x3030203030303030 in ?? ()
#28 0x363437372031343a in ?? ()
#29 0x2020202020203031 in ?? ()
#30 0x2020202020202020 in ?? ()
#31 0x2020202020202020 in ?? ()
#32 0x69642f706d742f20 in ?? ()
#33 0x2d747365742d7473 in ?? ()
#34 0x355737336b736174 in ?? ()
#35 0x2d747365742f4b68 in ?? ()
#36 0x6e6173742f706d74 in ?? ()
#37 0x2e617461646f722e in ?? ()
#38 0x6564282039393833 in ?? ()
#39 0x660029646574656c in ?? ()
#40 0x3030333163323962 in ?? ()
#41 0x63323962662d3030 in ?? ()
#42 0x7220303030306231 in ?? ()
#43 0x3030303020702d2d in ?? ()
#44 0x3a30302030303030 in ?? ()
#45 0x3136343737203134 in ?? ()
#46 0x2020202020202030 in ?? ()
#47 0x2020202020202020 in ?? ()
#48 0x2020202020202020 in ?? ()
#49 0x7369642f706d742f in ?? ()
#50 0x742d747365742d74 in ?? ()
#51 0x68355737336b7361 in ?? ()
#52 0x742d747365742f4b in ?? ()
#53 0x2e6e6173742f706d in ?? ()
#54 0x332e617461646f72 in ?? ()
#55 0x6c65642820393938 in ?? ()
#56 0x6266002964657465 in ?? ()
#57 0x3030306231633239 in ?? ()
#58 0x3263323962662d30 in ?? ()
#59 0x2d72203030303033 in ?? ()
#60 0x303030303020702d in ?? ()
#61 0x343a303020303030 in ?? ()
#62 0x3031363437372031 in ?? ()
#63 0x2020202020202020 in ?? ()
#64 0x2020202020202020 in ?? ()
#65 0x2f20202020202020 in ?? ()
#66 0x747369642f706d74 in ?? ()
#67 0x61742d747365742d in ?? ()
#68 0x4b68355737336b73 in ?? ()
#69 0x6d742d747365742f in ?? ()
#70 0x722e6e6173742f70 in ?? ()
#71 0x38332e617461646f in ?? ()
#72 0x656c656428203939 in ?? ()
#73 0x3962660029646574 in ?? ()
#74 0x3030303033326332 in ?? ()
#75 0x623263323962662d in ?? ()
#76 0x2d2d722030303030 in ?? ()
#77 0x3030303030302070 in ?? ()
#78 0x31343a3030203030 in ?? ()
#79 0x2030313634373720 in ?? ()
#80 0x2020202020202020 in ?? ()
#81 0x2020202020202020 in ?? ()
#82 0x742f202020202020 in ?? ()
#83 0x2d747369642f706d in ?? ()
#84 0x7361742d74736574 in ?? ()
#85 0x2f4b68355737336b in ?? ()
#86 0x706d742d74736574 in ?? ()
#87 0x6f722e6e6173742f in ?? ()
#88 0x3938332e61746164 in ?? ()
#89 0x74656c6564282039 in ?? ()
#90 0x3239626600296465 in ?? ()
#91 0x2d30303030623263 in ?? ()
#92 0x3033336332396266 in ?? ()
#93 0x702d2d7220303030 in ?? ()
#94 0x3030303030303020 in ?? ()
#95 0x2031343a30302030 in ?? ()
#96 0x2020303136343737 in ?? ()
#97 0x2020202020202020 in ?? ()
#98 0x2020202020202020 in ?? ()
#99 0x6d742f2020202020 in ?? ()
#100 0x742d747369642f70 in ?? ()
#101 0x6b7361742d747365 in ?? ()
#102 0x742f4b6835573733 in ?? ()
#103 0x2f706d742d747365 in ?? ()
#104 0x646f722e6e617374 in ?? ()
#105 0x393938332e617461 in ?? ()
#106 0x6574656c65642820 in ?? ()
#107 0x6332396266002964 in ?? ()
#108 0x662d303030303333 in ?? ()
#109 0x3030623363323962 in ?? ()
#110 0x20702d2d72203030 in ?? ()
#111 0x3030303030303030 in ?? ()
#112 0x372031343a303020 in ?? ()
#113 0x2020203031363437 in ?? ()
#114 0x2020202020202020 in ?? ()
#115 0x2020202020202020 in ?? ()
#116 0x706d742f20202020 in ?? ()
#117 0x65742d747369642f in ?? ()
#118 0x336b7361742d7473 in ?? ()
#119 0x65742f4b68355737 in ?? ()
#120 0x742f706d742d7473 in ?? ()
#121 0x61646f722e6e6173 in ?? ()
#122 0x20393938332e6174 in ?? ()
#123 0x646574656c656428 in ?? ()
#124 0x3363323962660029 in ?? ()
#125 0x62662d3030303062 in ?? ()
#126 0x3030303334633239 in ?? ()
#127 0x3020702d2d722030 in ?? ()
#128 0x2030303030303030 in ?? ()
#129 0x37372031343a3030 in ?? ()
#130 0x2020202030313634 in ?? ()
#131 0x2020202020202020 in ?? ()
#132 0x2020202020202020 in ?? ()
#133 0x2f706d742f202020 in ?? ()
#134 0x7365742d74736964 in ?? ()
#135 0x37336b7361742d74 in ?? ()
#136 0x7365742f4b683557 in ?? ()
#137 0x73742f706d742d74 in ?? ()
#138 0x7461646f722e6e61 in ?? ()
#139 0x2820393938332e61 in ?? ()
#140 0x29646574656c6564 in ?? ()
#141 0x333463323962660a in ?? ()
#142 0x3962662d30303030 in ?? ()
#143 0x3030303062346332 in ?? ()
#144 0x303020702d2d7220 in ?? ()
#145 0x3020303030303030 in ?? ()
#146 0x3437372031343a30 in ?? ()
#147 0x2020202020303136 in ?? ()
#148 0x2020202020202020 in ?? ()
#149 0x2020202020202020 in ?? ()
#150 0x642f706d742f2020 in ?? ()
#151 0x00007fedff6be020 in ?? ()
#152 0x000000000052f42c in __sanitizer::theDepot ()
#153 0x00000000004e5308 in __sanitizer::theDepot ()
#154 0xaf8a040000000000 in ?? ()
#155 0x00007fedff6be282 in ?? ()
#156 0x00000000004dfcfb in __sanitizer::theDepot ()
#157 0x01ec84000f5c07d6 in ?? ()
#158 0xffffffffffffffff in ?? ()
#159 0xffffffffffffffff in ?? ()
#160 0xffffffffffffffff in ?? ()
#161 0xffffffffffffffff in ?? ()
#162 0xffffffffffffffff in ?? ()
#163 0xffffffffffffffff in ?? ()
#164 0xffffffffffffffff in ?? ()
#165 0xffffffffffffffff in ?? ()
#166 0x00000000000003ff in ?? ()
#167 0x00007fedff6be261 in ?? ()
#168 0x00000fb92c3b0000 in ?? ()
#169 0x00007fedff6bde4f in ?? ()
#170 0x0000000000000400 in ?? ()
#171 0x00007fee45400021 in ?? ()
#172 0x00007fee4716d574 in ?? ()
#173 0x00007fedff6bde4b in ?? ()
#174 0x000000000046924d in __sanitizer::internal_alloc_placeholder ()
#175 0x00007fedff6be1d8 in ?? ()
#176 0x0000000000000000 in ?? ()

Thread 117 (LWP 4023):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 116 (LWP 4022):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 115 (LWP 4021):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 114 (LWP 4020):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 4019):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 4018):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 4017):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 4016):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 4015):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 4014):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 4013):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 4012):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 4011):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 4010):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 4009):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 4008):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 4007):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 4006):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 4005):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 4004):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 97 (LWP 4003):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 4002):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 4001):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 4000):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 3999):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 3998):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 3997):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 3996):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 3995):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 3994):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 3993):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 3992):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 3991):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 3990):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 3989):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 3988):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 3987):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 3986):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 3985):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 3984):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 3983):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000006 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00007b24001147c8 in ?? ()
#4  0x00007fee148ba710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fee148ba730 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 76 (LWP 3982):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 75 (LWP 3981):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 74 (LWP 3980):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 3979):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 3978):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 3977):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 3976):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 3975):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 3974):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 3973):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 3972):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 3971):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 3970):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 3969):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 3968):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 3967):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 3966):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 3965):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 3964):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 3963):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b24000b902c in ?? ()
#4  0x00007fee1ecbc710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fee1ecbc730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x007f0400000026c2 in ?? ()
#9  0x00007fee48c8b770 in ?? ()
#10 0x00007fee1ecbc730 in ?? ()
#11 0x0002008300000dfe in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 56 (LWP 3962):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 55 (LWP 3961):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 3960):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 3959):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 3958):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 3957):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 3956):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 3955):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 3954):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 3953):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 3952):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 3951):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 3950):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 3949):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 3948):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 3947):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 3946):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 3945):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 3944):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 3943):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000003 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240005ffec in ?? ()
#4  0x00007fee290be710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fee290be730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000000000045e4c9 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007fee48c8b770 in ?? ()
#10 0x00007fee290be730 in ?? ()
#11 0x00007fee4109cc60 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 36 (LWP 3942):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x00000000000004cb in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240005d7fc in ?? ()
#4  0x00007fee29ab6710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fee29ab6730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000000000045e4c9 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007fee48c8b770 in ?? ()
#10 0x00007fee29ab6730 in ?? ()
#11 0x00007fedfc4afca0 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 35 (LWP 3941):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x000000000000032b in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b2400058ffc in ?? ()
#4  0x00007fee2a2b7710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fee2a2b7730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000000000045e4c9 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007fee48c8b770 in ?? ()
#10 0x00007fee2a2b7730 in ?? ()
#11 0x00007fedfaeacaa0 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 34 (LWP 3940):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x00000000000000a2 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00007b24000547f8 in ?? ()
#4  0x00007fee2aab8710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fee2aab8730 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 33 (LWP 3939):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240004fffc in ?? ()
#4  0x00007fee2b2b9710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fee2b2b9730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000000000045e4c9 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007fee48c8b770 in ?? ()
#10 0x00007fee2b2b9730 in ?? ()
#11 0x00007fee415ffc60 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 32 (LWP 3938):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240004900c in ?? ()
#4  0x00007fee2baba710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fee2baba730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000000000045e4c9 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007fee48c8b770 in ?? ()
#10 0x00007fee2baba730 in ?? ()
#11 0x00007fee415f7c60 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 31 (LWP 3937):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240004480c in ?? ()
#4  0x00007fee2c2bb710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fee2c2bb730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000000000045e4c9 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007fee48c8b770 in ?? ()
#10 0x00007fee2c2bb730 in ?? ()
#11 0x00007fee415efc60 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 30 (LWP 3936):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 3935):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 3934):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 3933):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 3932):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 3931):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 3930):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 3929):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 3928):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 3927):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 3926):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 19 (LWP 3925):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 18 (LWP 3924):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 17 (LWP 3923):
#0  0x00007fee48c8bfb9 in ?? ()
#1  0x0000000017a335c0 in ?? ()
#2  0x0000000000000006 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4800003a00 in ?? ()
#5  0x00007fee3368e700 in ?? ()
#6  0x000000000000000c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 16 (LWP 3922):
#0  0x00007fee48c8bfb9 in ?? ()
#1  0x00007fee33e8f9a8 in ?? ()
#2  0x000000000000000d in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b44000372d8 in ?? ()
#5  0x00007fee33e8f840 in ?? ()
#6  0x000000000000001a in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 15 (LWP 3921):
#0  0x00007fee48c8bfb9 in ?? ()
#1  0x0000000000000018 in ?? ()
#2  0x0000000000000006 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b5800000118 in ?? ()
#5  0x00007fee34690410 in ?? ()
#6  0x000000000000000c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 3920):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 13 (LWP 3919):
#0  0x00007fee44090a47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 3918):
#0  0x00007fee44090a47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 11 (LWP 3917):
#0  0x00007fee44090a47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 10 (LWP 3916):
#0  0x00007fee44090a47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 9 (LWP 3913):
#0  0x00007fee44083cb9 in ?? ()
#1  0x00007fee3cebcc10 in ?? ()
#2  0x00007b0400009010 in ?? ()
#3  0x00007fee3cebdb80 in ?? ()
#4  0x00007fee3cebcc10 in ?? ()
#5  0x00007b0400009010 in ?? ()
#6  0x00000000004888a3 in __sanitizer::internal_alloc_placeholder ()
#7  0x00007fee419c2000 in ?? ()
#8  0x0100000000000001 in ?? ()
#9  0x00007fee3cebdb80 in ?? ()
#10 0x00007fee4da68908 in ?? ()
#11 0x0000000000000000 in ?? ()

Thread 8 (LWP 3912):
#0  0x00007fee48c8bfb9 in ?? ()
#1  0x0000600000000000 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4400034018 in ?? ()
#5  0x00007fee3c6bb7f0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 7 (LWP 3911):
#0  0x00007fee48c8f9e2 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 6 (LWP 3904):
#0  0x00007fee48c8bfb9 in ?? ()
#1  0x00007fee3debea40 in ?? ()
#2  0x000000000000014a in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4400035b98 in ?? ()
#5  0x00007fee3debe5d0 in ?? ()
#6  0x0000000000000294 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 5 (LWP 3903):
#0  0x00007fee48c8bfb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 4 (LWP 3902):
#0  0x00007fee48c8bfb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 3 (LWP 3901):
#0  0x00007fee48c8bfb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 2 (LWP 3900):
#0  0x00007fee440537a0 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 1 (LWP 3899):
#0  0x00007fee48c8fd50 in ?? ()
#1  0x00007ffebb936970 in ?? ()
#2  0x0000000000467b2b in __sanitizer::internal_alloc_placeholder ()
#3  0x00007fee432b1cc0 in ?? ()
#4  0x00007fee432b1cc0 in ?? ()
#5  0x00007ffebb936910 in ?? ()
#6  0x000000000048aef4 in __sanitizer::internal_alloc_placeholder ()
#7  0x0000600001000078 in ?? ()
#8  0xffffffff00aa1299 in ?? ()
#9  0x00007fee432b1cc0 in ?? ()
#10 0x00007fee471b7f0b in ?? ()
#11 0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20250624 19:59:11.730051  3133 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 1 with UUID 68b8a31f84ff465b97d2cb3c2fa3a021 and pid 4032
************************ BEGIN STACKS **************************
[New LWP 4033]
[New LWP 4034]
[New LWP 4035]
[New LWP 4036]
[New LWP 4037]
[New LWP 4044]
[New LWP 4045]
[New LWP 4046]
[New LWP 4049]
[New LWP 4050]
[New LWP 4051]
[New LWP 4052]
[New LWP 4053]
[New LWP 4054]
[New LWP 4055]
[New LWP 4056]
[New LWP 4057]
[New LWP 4058]
[New LWP 4059]
[New LWP 4060]
[New LWP 4061]
[New LWP 4062]
[New LWP 4063]
[New LWP 4064]
[New LWP 4065]
[New LWP 4066]
[New LWP 4067]
[New LWP 4068]
[New LWP 4069]
[New LWP 4070]
[New LWP 4071]
[New LWP 4072]
[New LWP 4073]
[New LWP 4074]
[New LWP 4075]
[New LWP 4076]
[New LWP 4077]
[New LWP 4078]
[New LWP 4079]
[New LWP 4080]
[New LWP 4081]
[New LWP 4082]
[New LWP 4083]
[New LWP 4084]
[New LWP 4085]
[New LWP 4086]
[New LWP 4087]
[New LWP 4088]
[New LWP 4089]
[New LWP 4090]
[New LWP 4091]
[New LWP 4092]
[New LWP 4093]
[New LWP 4094]
[New LWP 4095]
[New LWP 4096]
[New LWP 4097]
[New LWP 4098]
[New LWP 4099]
[New LWP 4100]
[New LWP 4101]
[New LWP 4102]
[New LWP 4103]
[New LWP 4104]
[New LWP 4105]
[New LWP 4106]
[New LWP 4107]
[New LWP 4108]
[New LWP 4109]
[New LWP 4110]
[New LWP 4111]
[New LWP 4112]
[New LWP 4113]
[New LWP 4114]
[New LWP 4115]
[New LWP 4116]
[New LWP 4117]
[New LWP 4118]
[New LWP 4119]
[New LWP 4120]
[New LWP 4121]
[New LWP 4122]
[New LWP 4123]
[New LWP 4124]
[New LWP 4125]
[New LWP 4126]
[New LWP 4127]
[New LWP 4128]
[New LWP 4129]
[New LWP 4130]
[New LWP 4131]
[New LWP 4132]
[New LWP 4133]
[New LWP 4134]
[New LWP 4135]
[New LWP 4136]
[New LWP 4137]
[New LWP 4138]
[New LWP 4139]
[New LWP 4140]
[New LWP 4141]
[New LWP 4142]
[New LWP 4143]
[New LWP 4144]
[New LWP 4145]
[New LWP 4146]
[New LWP 4147]
[New LWP 4148]
[New LWP 4149]
[New LWP 4150]
[New LWP 4151]
[New LWP 4152]
[New LWP 4153]
[New LWP 4154]
[New LWP 4155]
[New LWP 4156]
[New LWP 4157]
[New LWP 4158]
[New LWP 4159]
[New LWP 4160]
[New LWP 4161]
[New LWP 4162]
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c4802038e
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c4802038e
0x00007f8ec558bd50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 4032 "kudu"   0x00007f8ec558bd50 in ?? ()
  2    LWP 4033 "kudu"   0x00007f8ec094f7a0 in ?? ()
  3    LWP 4034 "kudu"   0x00007f8ec5587fb9 in ?? ()
  4    LWP 4035 "kudu"   0x00007f8ec5587fb9 in ?? ()
  5    LWP 4036 "kudu"   0x00007f8ec5587fb9 in ?? ()
  6    LWP 4037 "kernel-watcher-" 0x00007f8ec5587fb9 in ?? ()
  7    LWP 4044 "ntp client-4044" 0x00007f8ec558b9e2 in ?? ()
  8    LWP 4045 "file cache-evic" 0x00007f8ec5587fb9 in ?? ()
  9    LWP 4046 "sq_acceptor" 0x00007f8ec097fcb9 in ?? ()
  10   LWP 4049 "rpc reactor-404" 0x00007f8ec098ca47 in ?? ()
  11   LWP 4050 "rpc reactor-405" 0x00007f8ec098ca47 in ?? ()
  12   LWP 4051 "rpc reactor-405" 0x00007f8ec098ca47 in ?? ()
  13   LWP 4052 "rpc reactor-405" 0x00007f8ec098ca47 in ?? ()
  14   LWP 4053 "MaintenanceMgr " 0x00007f8ec5587ad3 in ?? ()
  15   LWP 4054 "txn-status-mana" 0x00007f8ec5587fb9 in ?? ()
  16   LWP 4055 "collect_and_rem" 0x00007f8ec5587fb9 in ?? ()
  17   LWP 4056 "tc-session-exp-" 0x00007f8ec5587fb9 in ?? ()
  18   LWP 4057 "rpc worker-4057" 0x00007f8ec5587ad3 in ?? ()
  19   LWP 4058 "rpc worker-4058" 0x00007f8ec5587ad3 in ?? ()
  20   LWP 4059 "rpc worker-4059" 0x00007f8ec5587ad3 in ?? ()
  21   LWP 4060 "rpc worker-4060" 0x00007f8ec5587ad3 in ?? ()
  22   LWP 4061 "rpc worker-4061" 0x00007f8ec5587ad3 in ?? ()
  23   LWP 4062 "rpc worker-4062" 0x00007f8ec5587ad3 in ?? ()
  24   LWP 4063 "rpc worker-4063" 0x00007f8ec5587ad3 in ?? ()
  25   LWP 4064 "rpc worker-4064" 0x00007f8ec5587ad3 in ?? ()
  26   LWP 4065 "rpc worker-4065" 0x00007f8ec5587ad3 in ?? ()
  27   LWP 4066 "rpc worker-4066" 0x00007f8ec5587ad3 in ?? ()
  28   LWP 4067 "rpc worker-4067" 0x00007f8ec5587ad3 in ?? ()
  29   LWP 4068 "rpc worker-4068" 0x00007f8ec5587ad3 in ?? ()
  30   LWP 4069 "rpc worker-4069" 0x00007f8ec5587ad3 in ?? ()
  31   LWP 4070 "rpc worker-4070" 0x00007f8ec5587ad3 in ?? ()
  32   LWP 4071 "rpc worker-4071" 0x00007f8ec5587ad3 in ?? ()
  33   LWP 4072 "rpc worker-4072" 0x00007f8ec5587ad3 in ?? ()
  34   LWP 4073 "rpc worker-4073" 0x00007f8ec5587ad3 in ?? ()
  35   LWP 4074 "rpc worker-4074" 0x00007f8ec5587ad3 in ?? ()
  36   LWP 4075 "rpc worker-4075" 0x00007f8ec5587ad3 in ?? ()
  37   LWP 4076 "rpc worker-4076" 0x00007f8ec5587ad3 in ?? ()
  38   LWP 4077 "rpc worker-4077" 0x00007f8ec5587ad3 in ?? ()
  39   LWP 4078 "rpc worker-4078" 0x00007f8ec5587ad3 in ?? ()
  40   LWP 4079 "rpc worker-4079" 0x00007f8ec5587ad3 in ?? ()
  41   LWP 4080 "rpc worker-4080" 0x00007f8ec5587ad3 in ?? ()
  42   LWP 4081 "rpc worker-4081" 0x00007f8ec5587ad3 in ?? ()
  43   LWP 4082 "rpc worker-4082" 0x00007f8ec5587ad3 in ?? ()
  44   LWP 4083 "rpc worker-4083" 0x00007f8ec5587ad3 in ?? ()
  45   LWP 4084 "rpc worker-4084" 0x00007f8ec5587ad3 in ?? ()
  46   LWP 4085 "rpc worker-4085" 0x00007f8ec5587ad3 in ?? ()
  47   LWP 4086 "rpc worker-4086" 0x00007f8ec5587ad3 in ?? ()
  48   LWP 4087 "rpc worker-4087" 0x00007f8ec5587ad3 in ?? ()
  49   LWP 4088 "rpc worker-4088" 0x00007f8ec5587ad3 in ?? ()
  50   LWP 4089 "rpc worker-4089" 0x00007f8ec5587ad3 in ?? ()
  51   LWP 4090 "rpc worker-4090" 0x00007f8ec5587ad3 in ?? ()
  52   LWP 4091 "rpc worker-4091" 0x00007f8ec5587ad3 in ?? ()
  53   LWP 4092 "rpc worker-4092" 0x00007f8ec5587ad3 in ?? ()
  54   LWP 4093 "rpc worker-4093" 0x00007f8ec5587ad3 in ?? ()
  55   LWP 4094 "rpc worker-4094" 0x00007f8ec5587ad3 in ?? ()
  56   LWP 4095 "rpc worker-4095" 0x00007f8ec5587ad3 in ?? ()
  57   LWP 4096 "rpc worker-4096" 0x00007f8ec5587ad3 in ?? ()
  58   LWP 4097 "rpc worker-4097" 0x00007f8ec5587ad3 in ?? ()
  59   LWP 4098 "rpc worker-4098" 0x00007f8ec5587ad3 in ?? ()
  60   LWP 4099 "rpc worker-4099" 0x00007f8ec5587ad3 in ?? ()
  61   LWP 4100 "rpc worker-4100" 0x00007f8ec5587ad3 in ?? ()
  62   LWP 4101 "rpc worker-4101" 0x00007f8ec5587ad3 in ?? ()
  63   LWP 4102 "rpc worker-4102" 0x00007f8ec5587ad3 in ?? ()
  64   LWP 4103 "rpc worker-4103" 0x00007f8ec5587ad3 in ?? ()
  65   LWP 4104 "rpc worker-4104" 0x00007f8ec5587ad3 in ?? ()
  66   LWP 4105 "rpc worker-4105" 0x00007f8ec5587ad3 in ?? ()
  67   LWP 4106 "rpc worker-4106" 0x00007f8ec5587ad3 in ?? ()
  68   LWP 4107 "rpc worker-4107" 0x00007f8ec5587ad3 in ?? ()
  69   LWP 4108 "rpc worker-4108" 0x00007f8ec5587ad3 in ?? ()
  70   LWP 4109 "rpc worker-4109" 0x00007f8ec5587ad3 in ?? ()
  71   LWP 4110 "rpc worker-4110" 0x00007f8ec5587ad3 in ?? ()
  72   LWP 4111 "rpc worker-4111" 0x00007f8ec5587ad3 in ?? ()
  73   LWP 4112 "rpc worker-4112" 0x00007f8ec5587ad3 in ?? ()
  74   LWP 4113 "rpc worker-4113" 0x00007f8ec5587ad3 in ?? ()
  75   LWP 4114 "rpc worker-4114" 0x00007f8ec5587ad3 in ?? ()
  76   LWP 4115 "rpc worker-4115" 0x00007f8ec5587ad3 in ?? ()
  77   LWP 4116 "rpc worker-4116" 0x00007f8ec5587ad3 in ?? ()
  78   LWP 4117 "rpc worker-4117" 0x00007f8ec5587ad3 in ?? ()
  79   LWP 4118 "rpc worker-4118" 0x00007f8ec5587ad3 in ?? ()
  80   LWP 4119 "rpc worker-4119" 0x00007f8ec5587ad3 in ?? ()
  81   LWP 4120 "rpc worker-4120" 0x00007f8ec5587ad3 in ?? ()
  82   LWP 4121 "rpc worker-4121" 0x00007f8ec5587ad3 in ?? ()
  83   LWP 4122 "rpc worker-4122" 0x00007f8ec5587ad3 in ?? ()
  84   LWP 4123 "rpc worker-4123" 0x00007f8ec5587ad3 in ?? ()
  85   LWP 4124 "rpc worker-4124" 0x00007f8ec5587ad3 in ?? ()
  86   LWP 4125 "rpc worker-4125" 0x00007f8ec5587ad3 in ?? ()
  87   LWP 4126 "rpc worker-4126" 0x00007f8ec5587ad3 in ?? ()
  88   LWP 4127 "rpc worker-4127" 0x00007f8ec5587ad3 in ?? ()
  89   LWP 4128 "rpc worker-4128" 0x00007f8ec5587ad3 in ?? ()
  90   LWP 4129 "rpc worker-4129" 0x00007f8ec5587ad3 in ?? ()
  91   LWP 4130 "rpc worker-4130" 0x00007f8ec5587ad3 in ?? ()
  92   LWP 4131 "rpc worker-4131" 0x00007f8ec5587ad3 in ?? ()
  93   LWP 4132 "rpc worker-4132" 0x00007f8ec5587ad3 in ?? ()
  94   LWP 4133 "rpc worker-4133" 0x00007f8ec5587ad3 in ?? ()
  95   LWP 4134 "rpc worker-4134" 0x00007f8ec5587ad3 in ?? ()
  96   LWP 4135 "rpc worker-4135" 0x00007f8ec5587ad3 in ?? ()
  97   LWP 4136 "rpc worker-4136" 0x00007f8ec5587ad3 in ?? ()
  98   LWP 4137 "rpc worker-4137" 0x00007f8ec5587ad3 in ?? ()
  99   LWP 4138 "rpc worker-4138" 0x00007f8ec5587ad3 in ?? ()
  100  LWP 4139 "rpc worker-4139" 0x00007f8ec5587ad3 in ?? ()
  101  LWP 4140 "rpc worker-4140" 0x00007f8ec5587ad3 in ?? ()
  102  LWP 4141 "rpc worker-4141" 0x00007f8ec5587ad3 in ?? ()
  103  LWP 4142 "rpc worker-4142" 0x00007f8ec5587ad3 in ?? ()
  104  LWP 4143 "rpc worker-4143" 0x00007f8ec5587ad3 in ?? ()
  105  LWP 4144 "rpc worker-4144" 0x00007f8ec5587ad3 in ?? ()
  106  LWP 4145 "rpc worker-4145" 0x00007f8ec5587ad3 in ?? ()
  107  LWP 4146 "rpc worker-4146" 0x00007f8ec5587ad3 in ?? ()
  108  LWP 4147 "rpc worker-4147" 0x00007f8ec5587ad3 in ?? ()
  109  LWP 4148 "rpc worker-4148" 0x00007f8ec5587ad3 in ?? ()
  110  LWP 4149 "rpc worker-4149" 0x00007f8ec5587ad3 in ?? ()
  111  LWP 4150 "rpc worker-4150" 0x00007f8ec5587ad3 in ?? ()
  112  LWP 4151 "rpc worker-4151" 0x00007f8ec5587ad3 in ?? ()
  113  LWP 4152 "rpc worker-4152" 0x00007f8ec5587ad3 in ?? ()
  114  LWP 4153 "rpc worker-4153" 0x00007f8ec5587ad3 in ?? ()
  115  LWP 4154 "rpc worker-4154" 0x00007f8ec5587ad3 in ?? ()
  116  LWP 4155 "rpc worker-4155" 0x00007f8ec5587ad3 in ?? ()
  117  LWP 4156 "rpc worker-4156" 0x00007f8ec5587ad3 in ?? ()
  118  LWP 4157 "diag-logger-415" 0x00007f8ec5587fb9 in ?? ()
  119  LWP 4158 "result-tracker-" 0x00007f8ec5587fb9 in ?? ()
  120  LWP 4159 "excess-log-dele" 0x00007f8ec5587fb9 in ?? ()
  121  LWP 4160 "acceptor-4160" 0x00007f8ec098e0c7 in ?? ()
  122  LWP 4161 "heartbeat-4161" 0x00007f8ec5587fb9 in ?? ()
  123  LWP 4162 "maintenance_sch" 0x00007f8ec5587fb9 in ?? ()

Thread 123 (LWP 4162):
#0  0x00007f8ec5587fb9 in ?? ()
#1  0x00007f0100000000 in ?? ()
#2  0x0000000000000101 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b54000028f0 in ?? ()
#5  0x00007f8e797b96c0 in ?? ()
#6  0x0000000000000202 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 4161):
#0  0x00007f8ec5587fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 121 (LWP 4160):
#0  0x00007f8ec098e0c7 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 120 (LWP 4159):
#0  0x00007f8ec5587fb9 in ?? ()
#1  0x00007f8e7afbc940 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffcd87f77b0 in ?? ()
#5  0x00007f8e7afbc7b0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 4158):
#0  0x00007f8ec5587fb9 in ?? ()
#1  0x0000000085352f88 in ?? ()
#2  0x0000000000000040 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b3400001008 in ?? ()
#5  0x00007f8e7b7bd800 in ?? ()
#6  0x0000000000000080 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 118 (LWP 4157):
#0  0x00007f8ec5587fb9 in ?? ()
#1  0x00007f8ebea0e008 in ?? ()
#2  0x0000000000000040 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4000000c90 in ?? ()
#5  0x00007f8e7bfbe750 in ?? ()
#6  0x0000000000000080 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 4156):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 116 (LWP 4155):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 115 (LWP 4154):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 114 (LWP 4153):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 4152):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 4151):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 4150):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 4149):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 4148):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 4147):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 4146):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 4145):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 4144):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 4143):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 4142):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 4141):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 4140):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 4139):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 4138):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 4137):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 97 (LWP 4136):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 4135):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 4134):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 4133):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 4132):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 4131):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 4130):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 4129):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 4128):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 4127):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 4126):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 4125):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 4124):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 4123):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 4122):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 4121):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 4120):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 4119):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 4118):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 4117):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 4116):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000766 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00007b24001147c8 in ?? ()
#4  0x00007f8e911ba710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f8e911ba730 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 76 (LWP 4115):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000005 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240010ffcc in ?? ()
#4  0x00007f8e919bb710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f8e919bb730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x0000000000000000 in ?? ()

Thread 75 (LWP 4114):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x000000000000096f in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240010d7dc in ?? ()
#4  0x00007f8e921bc710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f8e921bc730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x0000000000000000 in ?? ()

Thread 74 (LWP 4113):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 4112):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 4111):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 4110):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 4109):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 4108):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 4107):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 4106):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 4105):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 4104):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 4103):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 4102):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 4101):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 4100):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 4099):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 4098):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 4097):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 4096):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b24000b902c in ?? ()
#4  0x00007f8e9b5bc710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f8e9b5bc730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x007f0400000026c2 in ?? ()
#9  0x00007f8ec5587770 in ?? ()
#10 0x00007f8e9b5bc730 in ?? ()
#11 0x0002008300000dfe in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 56 (LWP 4095):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 55 (LWP 4094):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 4093):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 4092):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 4091):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 4090):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 4089):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 4088):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 4087):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 4086):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 4085):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 4084):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 4083):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 4082):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 4081):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 4080):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 4079):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 4078):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 4077):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 4076):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000003 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240005ffec in ?? ()
#4  0x00007f8ea59be710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f8ea59be730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000000000045e4c9 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007f8ec5587770 in ?? ()
#10 0x00007f8ea59be730 in ?? ()
#11 0x00007f8eb65b5c60 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 36 (LWP 4075):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 35 (LWP 4074):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 34 (LWP 4073):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 33 (LWP 4072):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 32 (LWP 4071):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 31 (LWP 4070):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 30 (LWP 4069):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 4068):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 4067):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 4066):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 4065):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 4064):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 4063):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 4062):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 4061):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 4060):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 4059):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 19 (LWP 4058):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 18 (LWP 4057):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 17 (LWP 4056):
#0  0x00007f8ec5587fb9 in ?? ()
#1  0x0000000017a335c0 in ?? ()
#2  0x0000000000000006 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4800003a00 in ?? ()
#5  0x00007f8eaff8e700 in ?? ()
#6  0x000000000000000c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 16 (LWP 4055):
#0  0x00007f8ec5587fb9 in ?? ()
#1  0x00007f8eb078f9a8 in ?? ()
#2  0x000000000000000c in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b44000372d8 in ?? ()
#5  0x00007f8eb078f840 in ?? ()
#6  0x0000000000000018 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 15 (LWP 4054):
#0  0x00007f8ec5587fb9 in ?? ()
#1  0x0000000000000018 in ?? ()
#2  0x0000000000000006 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b5800000118 in ?? ()
#5  0x00007f8eb0f90410 in ?? ()
#6  0x000000000000000c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 4053):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 13 (LWP 4052):
#0  0x00007f8ec098ca47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 4051):
#0  0x00007f8ec098ca47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 11 (LWP 4050):
#0  0x00007f8ec098ca47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 10 (LWP 4049):
#0  0x00007f8ec098ca47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 9 (LWP 4046):
#0  0x00007f8ec097fcb9 in ?? ()
#1  0x00007f8eb97bcc10 in ?? ()
#2  0x00007b040000a050 in ?? ()
#3  0x00007f8eb97bdb80 in ?? ()
#4  0x00007f8eb97bcc10 in ?? ()
#5  0x00007b040000a050 in ?? ()
#6  0x00000000004888a3 in __sanitizer::internal_alloc_placeholder ()
#7  0x00007f8ebe402000 in ?? ()
#8  0x0100000000000001 in ?? ()
#9  0x00007f8eb97bdb80 in ?? ()
#10 0x00007f8eca364908 in ?? ()
#11 0x0000000000000000 in ?? ()

Thread 8 (LWP 4045):
#0  0x00007f8ec5587fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 7 (LWP 4044):
#0  0x00007f8ec558b9e2 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 6 (LWP 4037):
#0  0x00007f8ec5587fb9 in ?? ()
#1  0x00007f8eba7bea40 in ?? ()
#2  0x0000000000000148 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4400035b98 in ?? ()
#5  0x00007f8eba7be5d0 in ?? ()
#6  0x0000000000000290 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 5 (LWP 4036):
#0  0x00007f8ec5587fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 4 (LWP 4035):
#0  0x00007f8ec5587fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 3 (LWP 4034):
#0  0x00007f8ec5587fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 2 (LWP 4033):
#0  0x00007f8ec094f7a0 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 1 (LWP 4032):
#0  0x00007f8ec558bd50 in ?? ()
#1  0x0000600001000078 in ?? ()
#2  0x0000000000467b2b in __sanitizer::internal_alloc_placeholder ()
#3  0x00007f8ebfbadcc0 in ?? ()
#4  0x00007f8ebfbadcc0 in ?? ()
#5  0x00007ffcd87f75c0 in ?? ()
#6  0x000000000048aef4 in __sanitizer::internal_alloc_placeholder ()
#7  0x0000600001000078 in ?? ()
#8  0x0000e00000a94924 in ?? ()
#9  0x00007f8ebfbadcc0 in ?? ()
#10 0x00007f8ec3ab3f0b in ?? ()
#11 0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20250624 19:59:12.704610  3133 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 2 with UUID 9303e28334fc44629868721544e0dc96 and pid 4165
************************ BEGIN STACKS **************************
[New LWP 4166]
[New LWP 4167]
[New LWP 4168]
[New LWP 4169]
[New LWP 4171]
[New LWP 4178]
[New LWP 4179]
[New LWP 4180]
[New LWP 4183]
[New LWP 4184]
[New LWP 4185]
[New LWP 4186]
[New LWP 4187]
[New LWP 4188]
[New LWP 4189]
[New LWP 4190]
[New LWP 4191]
[New LWP 4192]
[New LWP 4193]
[New LWP 4194]
[New LWP 4195]
[New LWP 4196]
[New LWP 4197]
[New LWP 4198]
[New LWP 4199]
[New LWP 4200]
[New LWP 4201]
[New LWP 4202]
[New LWP 4203]
[New LWP 4204]
[New LWP 4205]
[New LWP 4206]
[New LWP 4207]
[New LWP 4208]
[New LWP 4209]
[New LWP 4210]
[New LWP 4211]
[New LWP 4212]
[New LWP 4213]
[New LWP 4214]
[New LWP 4215]
[New LWP 4216]
[New LWP 4217]
[New LWP 4218]
[New LWP 4219]
[New LWP 4220]
[New LWP 4221]
[New LWP 4222]
[New LWP 4223]
[New LWP 4224]
[New LWP 4225]
[New LWP 4226]
[New LWP 4227]
[New LWP 4228]
[New LWP 4229]
[New LWP 4230]
[New LWP 4231]
[New LWP 4232]
[New LWP 4233]
[New LWP 4234]
[New LWP 4235]
[New LWP 4236]
[New LWP 4237]
[New LWP 4238]
[New LWP 4239]
[New LWP 4240]
[New LWP 4241]
[New LWP 4242]
[New LWP 4243]
[New LWP 4244]
[New LWP 4245]
[New LWP 4246]
[New LWP 4247]
[New LWP 4248]
[New LWP 4249]
[New LWP 4250]
[New LWP 4251]
[New LWP 4252]
[New LWP 4253]
[New LWP 4254]
[New LWP 4255]
[New LWP 4256]
[New LWP 4257]
[New LWP 4258]
[New LWP 4259]
[New LWP 4260]
[New LWP 4261]
[New LWP 4262]
[New LWP 4263]
[New LWP 4264]
[New LWP 4265]
[New LWP 4266]
[New LWP 4267]
[New LWP 4268]
[New LWP 4269]
[New LWP 4270]
[New LWP 4271]
[New LWP 4272]
[New LWP 4273]
[New LWP 4274]
[New LWP 4275]
[New LWP 4276]
[New LWP 4277]
[New LWP 4278]
[New LWP 4279]
[New LWP 4280]
[New LWP 4281]
[New LWP 4282]
[New LWP 4283]
[New LWP 4284]
[New LWP 4285]
[New LWP 4286]
[New LWP 4287]
[New LWP 4288]
[New LWP 4289]
[New LWP 4290]
[New LWP 4291]
[New LWP 4292]
[New LWP 4293]
[New LWP 4294]
[New LWP 4295]
[New LWP 4296]
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c4802038e
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c4802038e
0x00007f07b4aaad50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 4165 "kudu"   0x00007f07b4aaad50 in ?? ()
  2    LWP 4166 "kudu"   0x00007f07afe6e7a0 in ?? ()
  3    LWP 4167 "kudu"   0x00007f07b4aa6fb9 in ?? ()
  4    LWP 4168 "kudu"   0x00007f07b4aa6fb9 in ?? ()
  5    LWP 4169 "kudu"   0x00007f07b4aa6fb9 in ?? ()
  6    LWP 4171 "kernel-watcher-" 0x00007f07b4aa6fb9 in ?? ()
  7    LWP 4178 "ntp client-4178" 0x00007f07b4aaa9e2 in ?? ()
  8    LWP 4179 "file cache-evic" 0x00007f07b4aa6fb9 in ?? ()
  9    LWP 4180 "sq_acceptor" 0x00007f07afe9ecb9 in ?? ()
  10   LWP 4183 "rpc reactor-418" 0x00007f07afeaba47 in ?? ()
  11   LWP 4184 "rpc reactor-418" 0x00007f07afeaba47 in ?? ()
  12   LWP 4185 "rpc reactor-418" 0x00007f07afeaba47 in ?? ()
  13   LWP 4186 "rpc reactor-418" 0x00007f07afeaba47 in ?? ()
  14   LWP 4187 "MaintenanceMgr " 0x00007f07b4aa6ad3 in ?? ()
  15   LWP 4188 "txn-status-mana" 0x00007f07b4aa6fb9 in ?? ()
  16   LWP 4189 "collect_and_rem" 0x00007f07b4aa6fb9 in ?? ()
  17   LWP 4190 "tc-session-exp-" 0x00007f07b4aa6fb9 in ?? ()
  18   LWP 4191 "rpc worker-4191" 0x00007f07b4aa6ad3 in ?? ()
  19   LWP 4192 "rpc worker-4192" 0x00007f07b4aa6ad3 in ?? ()
  20   LWP 4193 "rpc worker-4193" 0x00007f07b4aa6ad3 in ?? ()
  21   LWP 4194 "rpc worker-4194" 0x00007f07b4aa6ad3 in ?? ()
  22   LWP 4195 "rpc worker-4195" 0x00007f07b4aa6ad3 in ?? ()
  23   LWP 4196 "rpc worker-4196" 0x00007f07b4aa6ad3 in ?? ()
  24   LWP 4197 "rpc worker-4197" 0x00007f07b4aa6ad3 in ?? ()
  25   LWP 4198 "rpc worker-4198" 0x00007f07b4aa6ad3 in ?? ()
  26   LWP 4199 "rpc worker-4199" 0x00007f07b4aa6ad3 in ?? ()
  27   LWP 4200 "rpc worker-4200" 0x00007f07b4aa6ad3 in ?? ()
  28   LWP 4201 "rpc worker-4201" 0x00007f07b4aa6ad3 in ?? ()
  29   LWP 4202 "rpc worker-4202" 0x00007f07b4aa6ad3 in ?? ()
  30   LWP 4203 "rpc worker-4203" 0x00007f07b4aa6ad3 in ?? ()
  31   LWP 4204 "rpc worker-4204" 0x00007f07b4aa6ad3 in ?? ()
  32   LWP 4205 "rpc worker-4205" 0x00007f07b4aa6ad3 in ?? ()
  33   LWP 4206 "rpc worker-4206" 0x00007f07b4aa6ad3 in ?? ()
  34   LWP 4207 "rpc worker-4207" 0x00007f07b4aa6ad3 in ?? ()
  35   LWP 4208 "rpc worker-4208" 0x00007f07b4aa6ad3 in ?? ()
  36   LWP 4209 "rpc worker-4209" 0x00007f07b4aa6ad3 in ?? ()
  37   LWP 4210 "rpc worker-4210" 0x00007f07b4aa6ad3 in ?? ()
  38   LWP 4211 "rpc worker-4211" 0x00007f07b4aa6ad3 in ?? ()
  39   LWP 4212 "rpc worker-4212" 0x00007f07b4aa6ad3 in ?? ()
  40   LWP 4213 "rpc worker-4213" 0x00007f07b4aa6ad3 in ?? ()
  41   LWP 4214 "rpc worker-4214" 0x00007f07b4aa6ad3 in ?? ()
  42   LWP 4215 "rpc worker-4215" 0x00007f07b4aa6ad3 in ?? ()
  43   LWP 4216 "rpc worker-4216" 0x00007f07b4aa6ad3 in ?? ()
  44   LWP 4217 "rpc worker-4217" 0x00007f07b4aa6ad3 in ?? ()
  45   LWP 4218 "rpc worker-4218" 0x00007f07b4aa6ad3 in ?? ()
  46   LWP 4219 "rpc worker-4219" 0x00007f07b4aa6ad3 in ?? ()
  47   LWP 4220 "rpc worker-4220" 0x00007f07b4aa6ad3 in ?? ()
  48   LWP 4221 "rpc worker-4221" 0x00007f07b4aa6ad3 in ?? ()
  49   LWP 4222 "rpc worker-4222" 0x00007f07b4aa6ad3 in ?? ()
  50   LWP 4223 "rpc worker-4223" 0x00007f07b4aa6ad3 in ?? ()
  51   LWP 4224 "rpc worker-4224" 0x00007f07b4aa6ad3 in ?? ()
  52   LWP 4225 "rpc worker-4225" 0x00007f07b4aa6ad3 in ?? ()
  53   LWP 4226 "rpc worker-4226" 0x00007f07b4aa6ad3 in ?? ()
  54   LWP 4227 "rpc worker-4227" 0x00007f07b4aa6ad3 in ?? ()
  55   LWP 4228 "rpc worker-4228" 0x00007f07b4aa6ad3 in ?? ()
  56   LWP 4229 "rpc worker-4229" 0x00007f07b4aa6ad3 in ?? ()
  57   LWP 4230 "rpc worker-4230" 0x00007f07b4aa6ad3 in ?? ()
  58   LWP 4231 "rpc worker-4231" 0x00007f07b4aa6ad3 in ?? ()
  59   LWP 4232 "rpc worker-4232" 0x00007f07b4aa6ad3 in ?? ()
  60   LWP 4233 "rpc worker-4233" 0x00007f07b4aa6ad3 in ?? ()
  61   LWP 4234 "rpc worker-4234" 0x00007f07b4aa6ad3 in ?? ()
  62   LWP 4235 "rpc worker-4235" 0x00007f07b4aa6ad3 in ?? ()
  63   LWP 4236 "rpc worker-4236" 0x00007f07b4aa6ad3 in ?? ()
  64   LWP 4237 "rpc worker-4237" 0x00007f07b4aa6ad3 in ?? ()
  65   LWP 4238 "rpc worker-4238" 0x00007f07b4aa6ad3 in ?? ()
  66   LWP 4239 "rpc worker-4239" 0x00007f07b4aa6ad3 in ?? ()
  67   LWP 4240 "rpc worker-4240" 0x00007f07b4aa6ad3 in ?? ()
  68   LWP 4241 "rpc worker-4241" 0x00007f07b4aa6ad3 in ?? ()
  69   LWP 4242 "rpc worker-4242" 0x00007f07b4aa6ad3 in ?? ()
  70   LWP 4243 "rpc worker-4243" 0x00007f07b4aa6ad3 in ?? ()
  71   LWP 4244 "rpc worker-4244" 0x00007f07b4aa6ad3 in ?? ()
  72   LWP 4245 "rpc worker-4245" 0x00007f07b4aa6ad3 in ?? ()
  73   LWP 4246 "rpc worker-4246" 0x00007f07b4aa6ad3 in ?? ()
  74   LWP 4247 "rpc worker-4247" 0x00007f07b4aa6ad3 in ?? ()
  75   LWP 4248 "rpc worker-4248" 0x00007f07b4aa6ad3 in ?? ()
  76   LWP 4249 "rpc worker-4249" 0x00007f07b4aa6ad3 in ?? ()
  77   LWP 4250 "rpc worker-4250" 0x00007f07b4aa6ad3 in ?? ()
  78   LWP 4251 "rpc worker-4251" 0x00007f07b4aa6ad3 in ?? ()
  79   LWP 4252 "rpc worker-4252" 0x00007f07b4aa6ad3 in ?? ()
  80   LWP 4253 "rpc worker-4253" 0x00007f07b4aa6ad3 in ?? ()
  81   LWP 4254 "rpc worker-4254" 0x00007f07b4aa6ad3 in ?? ()
  82   LWP 4255 "rpc worker-4255" 0x00007f07b4aa6ad3 in ?? ()
  83   LWP 4256 "rpc worker-4256" 0x00007f07b4aa6ad3 in ?? ()
  84   LWP 4257 "rpc worker-4257" 0x00007f07b4aa6ad3 in ?? ()
  85   LWP 4258 "rpc worker-4258" 0x00007f07b4aa6ad3 in ?? ()
  86   LWP 4259 "rpc worker-4259" 0x00007f07b4aa6ad3 in ?? ()
  87   LWP 4260 "rpc worker-4260" 0x00007f07b4aa6ad3 in ?? ()
  88   LWP 4261 "rpc worker-4261" 0x00007f07b4aa6ad3 in ?? ()
  89   LWP 4262 "rpc worker-4262" 0x00007f07b4aa6ad3 in ?? ()
  90   LWP 4263 "rpc worker-4263" 0x00007f07b4aa6ad3 in ?? ()
  91   LWP 4264 "rpc worker-4264" 0x00007f07b4aa6ad3 in ?? ()
  92   LWP 4265 "rpc worker-4265" 0x00007f07b4aa6ad3 in ?? ()
  93   LWP 4266 "rpc worker-4266" 0x00007f07b4aa6ad3 in ?? ()
  94   LWP 4267 "rpc worker-4267" 0x00007f07b4aa6ad3 in ?? ()
  95   LWP 4268 "rpc worker-4268" 0x00007f07b4aa6ad3 in ?? ()
  96   LWP 4269 "rpc worker-4269" 0x00007f07b4aa6ad3 in ?? ()
  97   LWP 4270 "rpc worker-4270" 0x00007f07b4aa6ad3 in ?? ()
  98   LWP 4271 "rpc worker-4271" 0x00007f07b4aa6ad3 in ?? ()
  99   LWP 4272 "rpc worker-4272" 0x00007f07b4aa6ad3 in ?? ()
  100  LWP 4273 "rpc worker-4273" 0x00007f07b4aa6ad3 in ?? ()
  101  LWP 4274 "rpc worker-4274" 0x00007f07b4aa6ad3 in ?? ()
  102  LWP 4275 "rpc worker-4275" 0x00007f07b4aa6ad3 in ?? ()
  103  LWP 4276 "rpc worker-4276" 0x00007f07b4aa6ad3 in ?? ()
  104  LWP 4277 "rpc worker-4277" 0x00007f07b4aa6ad3 in ?? ()
  105  LWP 4278 "rpc worker-4278" 0x00007f07b4aa6ad3 in ?? ()
  106  LWP 4279 "rpc worker-4279" 0x00007f07b4aa6ad3 in ?? ()
  107  LWP 4280 "rpc worker-4280" 0x00007f07b4aa6ad3 in ?? ()
  108  LWP 4281 "rpc worker-4281" 0x00007f07b4aa6ad3 in ?? ()
  109  LWP 4282 "rpc worker-4282" 0x00007f07b4aa6ad3 in ?? ()
  110  LWP 4283 "rpc worker-4283" 0x00007f07b4aa6ad3 in ?? ()
  111  LWP 4284 "rpc worker-4284" 0x00007f07b4aa6ad3 in ?? ()
  112  LWP 4285 "rpc worker-4285" 0x00007f07b4aa6ad3 in ?? ()
  113  LWP 4286 "rpc worker-4286" 0x00007f07b4aa6ad3 in ?? ()
  114  LWP 4287 "rpc worker-4287" 0x00007f07b4aa6ad3 in ?? ()
  115  LWP 4288 "rpc worker-4288" 0x00007f07b4aa6ad3 in ?? ()
  116  LWP 4289 "rpc worker-4289" 0x00007f07b4aa6ad3 in ?? ()
  117  LWP 4290 "rpc worker-4290" 0x00007f07b4aa6ad3 in ?? ()
  118  LWP 4291 "diag-logger-429" 0x00007f07b4aa6fb9 in ?? ()
  119  LWP 4292 "result-tracker-" 0x00007f07b4aa6fb9 in ?? ()
  120  LWP 4293 "excess-log-dele" 0x00007f07b4aa6fb9 in ?? ()
  121  LWP 4294 "acceptor-4294" 0x00007f07afead0c7 in ?? ()
  122  LWP 4295 "heartbeat-4295" 0x00007f07b4aa6fb9 in ?? ()
  123  LWP 4296 "maintenance_sch" 0x00007f07b4aa6fb9 in ?? ()

Thread 123 (LWP 4296):
#0  0x00007f07b4aa6fb9 in ?? ()
#1  0x00007b0100000000 in ?? ()
#2  0x00000000000000fd in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b54000028f0 in ?? ()
#5  0x00007f0768cb96c0 in ?? ()
#6  0x00000000000001fa in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 4295):
#0  0x00007f07b4aa6fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 121 (LWP 4294):
#0  0x00007f07afead0c7 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 120 (LWP 4293):
#0  0x00007f07b4aa6fb9 in ?? ()
#1  0x00007f076a4bc940 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007fff876ef870 in ?? ()
#5  0x00007f076a4bc7b0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 4292):
#0  0x00007f07b4aa6fb9 in ?? ()
#1  0x0000000085352f88 in ?? ()
#2  0x000000000000003f in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b3400001008 in ?? ()
#5  0x00007f076acbd800 in ?? ()
#6  0x000000000000007e in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 118 (LWP 4291):
#0  0x00007f07b4aa6fb9 in ?? ()
#1  0x00007f07adf0e008 in ?? ()
#2  0x000000000000003b in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4000000c90 in ?? ()
#5  0x00007f076b4be750 in ?? ()
#6  0x0000000000000076 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 4290):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 116 (LWP 4289):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 115 (LWP 4288):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 114 (LWP 4287):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 4286):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 4285):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 4284):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 4283):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 4282):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 4281):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 4280):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 4279):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 4278):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 4277):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 4276):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 4275):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 4274):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 4273):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 4272):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 4271):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 97 (LWP 4270):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 4269):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 4268):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 4267):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 4266):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 4265):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 4264):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 4263):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 4262):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 4261):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 4260):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 4259):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 4258):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 4257):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 4256):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 4255):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 4254):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 4253):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 4252):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 4251):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 4250):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x00000000000008be in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00007b24001147c8 in ?? ()
#4  0x00007f07806ba710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f07806ba730 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 76 (LWP 4249):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x00000000000007bd in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240010ffcc in ?? ()
#4  0x00007f0780ebb710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f0780ebb730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000000000045e4c9 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007f07b4aa6770 in ?? ()
#10 0x00007f0780ebb730 in ?? ()
#11 0x00007f0764bd5278 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 75 (LWP 4248):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240010d7dc in ?? ()
#4  0x00007f07816bc710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f07816bc730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x0000000000000000 in ?? ()

Thread 74 (LWP 4247):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 4246):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 4245):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 4244):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 4243):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 4242):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 4241):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 4240):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 4239):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 4238):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 4237):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 4236):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 4235):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 4234):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 4233):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 4232):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 4231):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 4230):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b24000b902c in ?? ()
#4  0x00007f078aabc710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f078aabc730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x007f0400000026c2 in ?? ()
#9  0x00007f07b4aa6770 in ?? ()
#10 0x00007f078aabc730 in ?? ()
#11 0x0002008300000dfe in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 56 (LWP 4229):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 55 (LWP 4228):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 4227):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 4226):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 4225):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 4224):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 4223):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 4222):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 4221):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 4220):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 4219):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 4218):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 4217):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 4216):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 4215):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 4214):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 4213):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 4212):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 4211):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 4210):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00007b240005ffe8 in ?? ()
#4  0x00007f0794ebe710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f0794ebe730 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 36 (LWP 4209):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240005d7fc in ?? ()
#4  0x00007f07958b6710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f07958b6730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000000000045e4c9 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007f07b4aa6770 in ?? ()
#10 0x00007f07958b6730 in ?? ()
#11 0x00007f07ace97c48 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 35 (LWP 4208):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 34 (LWP 4207):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 33 (LWP 4206):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 32 (LWP 4205):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 31 (LWP 4204):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 30 (LWP 4203):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 4202):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 4201):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 4200):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 4199):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 4198):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 4197):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 4196):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 4195):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 4194):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 4193):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 19 (LWP 4192):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 18 (LWP 4191):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 17 (LWP 4190):
#0  0x00007f07b4aa6fb9 in ?? ()
#1  0x0000000017a335c0 in ?? ()
#2  0x0000000000000006 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4800003a00 in ?? ()
#5  0x00007f079f48e700 in ?? ()
#6  0x000000000000000c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 16 (LWP 4189):
#0  0x00007f07b4aa6fb9 in ?? ()
#1  0x00007f079fc8f9a8 in ?? ()
#2  0x000000000000000c in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b44000372d8 in ?? ()
#5  0x00007f079fc8f840 in ?? ()
#6  0x0000000000000018 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 15 (LWP 4188):
#0  0x00007f07b4aa6fb9 in ?? ()
#1  0x0000000000000018 in ?? ()
#2  0x0000000000000006 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b5800000118 in ?? ()
#5  0x00007f07a0490410 in ?? ()
#6  0x000000000000000c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 4187):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 13 (LWP 4186):
#0  0x00007f07afeaba47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 4185):
#0  0x00007f07afeaba47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 11 (LWP 4184):
#0  0x00007f07afeaba47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 10 (LWP 4183):
#0  0x00007f07afeaba47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 9 (LWP 4180):
#0  0x00007f07afe9ecb9 in ?? ()
#1  0x00007f07a8cbcc10 in ?? ()
#2  0x00007b040000a850 in ?? ()
#3  0x00007f07a8cbdb80 in ?? ()
#4  0x00007f07a8cbcc10 in ?? ()
#5  0x00007b040000a850 in ?? ()
#6  0x00000000004888a3 in __sanitizer::internal_alloc_placeholder ()
#7  0x00007f07ad910000 in ?? ()
#8  0x0100000000000001 in ?? ()
#9  0x00007f07a8cbdb80 in ?? ()
#10 0x00007f07b9883908 in ?? ()
#11 0x0000000000000000 in ?? ()

Thread 8 (LWP 4179):
#0  0x00007f07b4aa6fb9 in ?? ()
#1  0x0000600000000000 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4400034018 in ?? ()
#5  0x00007f07a84bb7f0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 7 (LWP 4178):
#0  0x00007f07b4aaa9e2 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 6 (LWP 4171):
#0  0x00007f07b4aa6fb9 in ?? ()
#1  0x00007f07a9cbea40 in ?? ()
#2  0x0000000000000144 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4400035b98 in ?? ()
#5  0x00007f07a9cbe5d0 in ?? ()
#6  0x0000000000000288 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 5 (LWP 4169):
#0  0x00007f07b4aa6fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 4 (LWP 4168):
#0  0x00007f07b4aa6fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 3 (LWP 4167):
#0  0x00007f07b4aa6fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 2 (LWP 4166):
#0  0x00007f07afe6e7a0 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 1 (LWP 4165):
#0  0x00007f07b4aaad50 in ?? ()
#1  0x0000600001000078 in ?? ()
#2  0x0000000000467b2b in __sanitizer::internal_alloc_placeholder ()
#3  0x00007f07af0cccc0 in ?? ()
#4  0x00007f07af0cccc0 in ?? ()
#5  0x00007fff876ef680 in ?? ()
#6  0x000000000048aef4 in __sanitizer::internal_alloc_placeholder ()
#7  0x0000600001000078 in ?? ()
#8  0x0000e00000a9a15b in ?? ()
#9  0x00007f07af0cccc0 in ?? ()
#10 0x00007f07b2fd2f0b in ?? ()
#11 0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20250624 19:59:13.687809  3133 external_mini_cluster.cc:1620] Killing /tmp/dist-test-task37W5hK/build/tsan/bin/kudu with pid 3899
I20250624 19:59:13.738415  3133 external_mini_cluster.cc:1620] Killing /tmp/dist-test-task37W5hK/build/tsan/bin/kudu with pid 4032
I20250624 19:59:13.788393  3133 external_mini_cluster.cc:1620] Killing /tmp/dist-test-task37W5hK/build/tsan/bin/kudu with pid 4165
I20250624 19:59:13.841274  3133 external_mini_cluster.cc:1620] Killing /tmp/dist-test-task37W5hK/build/tsan/bin/kudu with pid 3807
2025-06-24T19:59:13Z chronyd exiting
I20250624 19:59:13.898190  3133 test_util.cc:183] -----------------------------------------------
I20250624 19:59:13.898419  3133 test_util.cc:184] Had failures, leaving test files at /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0

Full log

Note: This is test shard 1 of 6.
[==========] Running 5 tests from 2 test suites.
[----------] Global test environment set-up.
[----------] 4 tests from TabletCopyITest
[ RUN      ] TabletCopyITest.TestRejectRogueLeader
/home/jenkins-slave/workspace/build_and_test_flaky@2/src/kudu/integration-tests/tablet_copy-itest.cc:172: Skipped
test is skipped; set KUDU_ALLOW_SLOW_TESTS=1 to run
[  SKIPPED ] TabletCopyITest.TestRejectRogueLeader (14 ms)
[ RUN      ] TabletCopyITest.TestDeleteLeaderDuringTabletCopyStressTest
/home/jenkins-slave/workspace/build_and_test_flaky@2/src/kudu/integration-tests/tablet_copy-itest.cc:727: Skipped
test is skipped; set KUDU_ALLOW_SLOW_TESTS=1 to run
[  SKIPPED ] TabletCopyITest.TestDeleteLeaderDuringTabletCopyStressTest (6 ms)
[ RUN      ] TabletCopyITest.TestTabletCopyThrottling
2025-06-24T19:57:38Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2025-06-24T19:57:38Z Disabled control of system clock
WARNING: Logging before InitGoogleLogging() is written to STDERR
I20250624 19:57:38.311332  3133 external_mini_cluster.cc:1351] Running /tmp/dist-test-task37W5hK/build/tsan/bin/kudu
/tmp/dist-test-task37W5hK/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.3.15.126:39801
--webserver_interface=127.3.15.126
--webserver_port=0
--builtin_ntp_servers=127.3.15.84:38311
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.3.15.126:39801
--master_tombstone_evicted_tablet_replicas=false with env {}
W20250624 19:57:38.628721  3142 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 19:57:38.629355  3142 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 19:57:38.629815  3142 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 19:57:38.661680  3142 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20250624 19:57:38.662022  3142 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 19:57:38.662310  3142 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20250624 19:57:38.662557  3142 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20250624 19:57:38.698589  3142 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.3.15.84:38311
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/wal
--master_tombstone_evicted_tablet_replicas=false
--ipki_ca_key_size=768
--master_addresses=127.3.15.126:39801
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.3.15.126:39801
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.3.15.126
--webserver_port=0
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 19:43:22 UTC on 5fd53c4cbb9d
build id 6753
TSAN enabled
I20250624 19:57:38.700047  3142 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 19:57:38.701929  3142 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 19:57:38.719360  3148 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 19:57:38.720698  3151 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 19:57:38.721624  3142 server_base.cc:1048] running on GCE node
W20250624 19:57:38.721381  3149 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 19:57:39.929651  3142 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 19:57:39.932421  3142 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 19:57:39.933887  3142 hybrid_clock.cc:648] HybridClock initialized: now 1750795059933843 us; error 52 us; skew 500 ppm
I20250624 19:57:39.934777  3142 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 19:57:39.942046  3142 webserver.cc:469] Webserver started at http://127.3.15.126:34655/ using document root <none> and password file <none>
I20250624 19:57:39.943120  3142 fs_manager.cc:362] Metadata directory not provided
I20250624 19:57:39.943342  3142 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 19:57:39.943859  3142 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 19:57:39.948566  3142 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/data/instance:
uuid: "b198b29095774ed493443570ccef15b7"
format_stamp: "Formatted at 2025-06-24 19:57:39 on dist-test-slave-0t1p"
I20250624 19:57:39.949944  3142 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/wal/instance:
uuid: "b198b29095774ed493443570ccef15b7"
format_stamp: "Formatted at 2025-06-24 19:57:39 on dist-test-slave-0t1p"
I20250624 19:57:39.958281  3142 fs_manager.cc:696] Time spent creating directory manager: real 0.008s	user 0.008s	sys 0.000s
I20250624 19:57:39.964542  3158 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20250624 19:57:39.965857  3142 fs_manager.cc:730] Time spent opening block manager: real 0.004s	user 0.002s	sys 0.002s
I20250624 19:57:39.966264  3142 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/data,/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/wal
uuid: "b198b29095774ed493443570ccef15b7"
format_stamp: "Formatted at 2025-06-24 19:57:39 on dist-test-slave-0t1p"
I20250624 19:57:39.966630  3142 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 19:57:40.041090  3142 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20250624 19:57:40.042686  3142 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 19:57:40.043164  3142 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 19:57:40.132573  3142 rpc_server.cc:307] RPC server started. Bound to: 127.3.15.126:39801
I20250624 19:57:40.132696  3209 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.3.15.126:39801 every 8 connection(s)
I20250624 19:57:40.135776  3142 server_base.cc:1180] Dumped server information to /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/data/info.pb
I20250624 19:57:40.142164  3210 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 19:57:40.144789  3133 external_mini_cluster.cc:1413] Started /tmp/dist-test-task37W5hK/build/tsan/bin/kudu as pid 3142
I20250624 19:57:40.145226  3133 external_mini_cluster.cc:1427] Reading /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/wal/instance
I20250624 19:57:40.165908  3210 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7: Bootstrap starting.
I20250624 19:57:40.172396  3210 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7: Neither blocks nor log segments found. Creating new log.
I20250624 19:57:40.174443  3210 log.cc:826] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7: Log is configured to *not* fsync() on all Append() calls
I20250624 19:57:40.180969  3210 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7: No bootstrap required, opened a new log
I20250624 19:57:40.204376  3210 raft_consensus.cc:357] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "b198b29095774ed493443570ccef15b7" member_type: VOTER last_known_addr { host: "127.3.15.126" port: 39801 } }
I20250624 19:57:40.205273  3210 raft_consensus.cc:383] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 19:57:40.205580  3210 raft_consensus.cc:738] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: b198b29095774ed493443570ccef15b7, State: Initialized, Role: FOLLOWER
I20250624 19:57:40.206434  3210 consensus_queue.cc:260] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "b198b29095774ed493443570ccef15b7" member_type: VOTER last_known_addr { host: "127.3.15.126" port: 39801 } }
I20250624 19:57:40.206967  3210 raft_consensus.cc:397] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 19:57:40.207198  3210 raft_consensus.cc:491] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 19:57:40.207465  3210 raft_consensus.cc:3058] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [term 0 FOLLOWER]: Advancing to term 1
I20250624 19:57:40.212278  3210 raft_consensus.cc:513] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "b198b29095774ed493443570ccef15b7" member_type: VOTER last_known_addr { host: "127.3.15.126" port: 39801 } }
I20250624 19:57:40.213280  3210 leader_election.cc:304] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: b198b29095774ed493443570ccef15b7; no voters: 
I20250624 19:57:40.215266  3210 leader_election.cc:290] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [CANDIDATE]: Term 1 election: Requested vote from peers 
I20250624 19:57:40.216941  3215 raft_consensus.cc:2802] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [term 1 FOLLOWER]: Leader election won for term 1
I20250624 19:57:40.220021  3215 raft_consensus.cc:695] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [term 1 LEADER]: Becoming Leader. State: Replica: b198b29095774ed493443570ccef15b7, State: Running, Role: LEADER
I20250624 19:57:40.221053  3215 consensus_queue.cc:237] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "b198b29095774ed493443570ccef15b7" member_type: VOTER last_known_addr { host: "127.3.15.126" port: 39801 } }
I20250624 19:57:40.224488  3210 sys_catalog.cc:564] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [sys.catalog]: configured and running, proceeding with master startup.
I20250624 19:57:40.234622  3216 sys_catalog.cc:455] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [sys.catalog]: SysCatalogTable state changed. Reason: New leader b198b29095774ed493443570ccef15b7. Latest consensus state: current_term: 1 leader_uuid: "b198b29095774ed493443570ccef15b7" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "b198b29095774ed493443570ccef15b7" member_type: VOTER last_known_addr { host: "127.3.15.126" port: 39801 } } }
I20250624 19:57:40.235471  3216 sys_catalog.cc:458] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [sys.catalog]: This master's current role is: LEADER
I20250624 19:57:40.236627  3217 sys_catalog.cc:455] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "b198b29095774ed493443570ccef15b7" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "b198b29095774ed493443570ccef15b7" member_type: VOTER last_known_addr { host: "127.3.15.126" port: 39801 } } }
I20250624 19:57:40.237516  3217 sys_catalog.cc:458] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [sys.catalog]: This master's current role is: LEADER
I20250624 19:57:40.240780  3222 catalog_manager.cc:1477] Loading table and tablet metadata into memory...
I20250624 19:57:40.255918  3222 catalog_manager.cc:1486] Initializing Kudu cluster ID...
I20250624 19:57:40.280433  3222 catalog_manager.cc:1349] Generated new cluster ID: 7c01f129cc9a4d4ab9975fb6353d4e6b
I20250624 19:57:40.280893  3222 catalog_manager.cc:1497] Initializing Kudu internal certificate authority...
I20250624 19:57:40.299153  3222 catalog_manager.cc:1372] Generated new certificate authority record
I20250624 19:57:40.300788  3222 catalog_manager.cc:1506] Loading token signing keys...
I20250624 19:57:40.320943  3222 catalog_manager.cc:5955] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7: Generated new TSK 0
I20250624 19:57:40.322152  3222 catalog_manager.cc:1516] Initializing in-progress tserver states...
I20250624 19:57:40.337599  3133 external_mini_cluster.cc:1351] Running /tmp/dist-test-task37W5hK/build/tsan/bin/kudu
/tmp/dist-test-task37W5hK/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.3.15.65:0
--local_ip_for_outbound_sockets=127.3.15.65
--webserver_interface=127.3.15.65
--webserver_port=0
--tserver_master_addrs=127.3.15.126:39801
--builtin_ntp_servers=127.3.15.84:38311
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--num_tablets_to_copy_simultaneously=1 with env {}
W20250624 19:57:40.678120  3234 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 19:57:40.678856  3234 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 19:57:40.679430  3234 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 19:57:40.713804  3234 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 19:57:40.714761  3234 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.3.15.65
I20250624 19:57:40.753093  3234 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.3.15.84:38311
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.3.15.65:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.3.15.65
--webserver_port=0
--tserver_master_addrs=127.3.15.126:39801
--num_tablets_to_copy_simultaneously=1
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.3.15.65
--log_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 19:43:22 UTC on 5fd53c4cbb9d
build id 6753
TSAN enabled
I20250624 19:57:40.754590  3234 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 19:57:40.756479  3234 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 19:57:40.779711  3241 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 19:57:40.780592  3243 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 19:57:40.781307  3234 server_base.cc:1048] running on GCE node
W20250624 19:57:40.782269  3240 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 19:57:42.012230  3234 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 19:57:42.015405  3234 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 19:57:42.017023  3234 hybrid_clock.cc:648] HybridClock initialized: now 1750795062016902 us; error 102 us; skew 500 ppm
I20250624 19:57:42.018199  3234 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 19:57:42.027087  3234 webserver.cc:469] Webserver started at http://127.3.15.65:46599/ using document root <none> and password file <none>
I20250624 19:57:42.028147  3234 fs_manager.cc:362] Metadata directory not provided
I20250624 19:57:42.028388  3234 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 19:57:42.028959  3234 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 19:57:42.033818  3234 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-0/data/instance:
uuid: "c9ff5371a1a14662a1a2fec8dd43e81c"
format_stamp: "Formatted at 2025-06-24 19:57:42 on dist-test-slave-0t1p"
I20250624 19:57:42.035125  3234 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-0/wal/instance:
uuid: "c9ff5371a1a14662a1a2fec8dd43e81c"
format_stamp: "Formatted at 2025-06-24 19:57:42 on dist-test-slave-0t1p"
I20250624 19:57:42.043897  3234 fs_manager.cc:696] Time spent creating directory manager: real 0.008s	user 0.009s	sys 0.001s
I20250624 19:57:42.051280  3250 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20250624 19:57:42.052735  3234 fs_manager.cc:730] Time spent opening block manager: real 0.005s	user 0.003s	sys 0.004s
I20250624 19:57:42.053136  3234 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-0/data,/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-0/wal
uuid: "c9ff5371a1a14662a1a2fec8dd43e81c"
format_stamp: "Formatted at 2025-06-24 19:57:42 on dist-test-slave-0t1p"
I20250624 19:57:42.053507  3234 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 19:57:42.115972  3234 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20250624 19:57:42.117728  3234 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 19:57:42.118245  3234 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 19:57:42.121799  3234 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 19:57:42.126696  3234 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 19:57:42.126991  3234 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20250624 19:57:42.127282  3234 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 19:57:42.127449  3234 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20250624 19:57:42.343452  3234 rpc_server.cc:307] RPC server started. Bound to: 127.3.15.65:37511
I20250624 19:57:42.343582  3362 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.3.15.65:37511 every 8 connection(s)
I20250624 19:57:42.346467  3234 server_base.cc:1180] Dumped server information to /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-0/data/info.pb
I20250624 19:57:42.353178  3133 external_mini_cluster.cc:1413] Started /tmp/dist-test-task37W5hK/build/tsan/bin/kudu as pid 3234
I20250624 19:57:42.353665  3133 external_mini_cluster.cc:1427] Reading /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-0/wal/instance
I20250624 19:57:42.360867  3133 external_mini_cluster.cc:1351] Running /tmp/dist-test-task37W5hK/build/tsan/bin/kudu
/tmp/dist-test-task37W5hK/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.3.15.66:0
--local_ip_for_outbound_sockets=127.3.15.66
--webserver_interface=127.3.15.66
--webserver_port=0
--tserver_master_addrs=127.3.15.126:39801
--builtin_ntp_servers=127.3.15.84:38311
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--num_tablets_to_copy_simultaneously=1 with env {}
I20250624 19:57:42.376533  3363 heartbeater.cc:344] Connected to a master server at 127.3.15.126:39801
I20250624 19:57:42.377269  3363 heartbeater.cc:461] Registering TS with master...
I20250624 19:57:42.379036  3363 heartbeater.cc:507] Master 127.3.15.126:39801 requested a full tablet report, sending...
I20250624 19:57:42.382680  3175 ts_manager.cc:194] Registered new tserver with Master: c9ff5371a1a14662a1a2fec8dd43e81c (127.3.15.65:37511)
I20250624 19:57:42.384861  3175 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.3.15.65:37957
W20250624 19:57:42.685814  3367 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 19:57:42.686381  3367 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 19:57:42.686916  3367 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 19:57:42.719117  3367 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 19:57:42.720037  3367 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.3.15.66
I20250624 19:57:42.756199  3367 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.3.15.84:38311
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.3.15.66:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.3.15.66
--webserver_port=0
--tserver_master_addrs=127.3.15.126:39801
--num_tablets_to_copy_simultaneously=1
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.3.15.66
--log_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 19:43:22 UTC on 5fd53c4cbb9d
build id 6753
TSAN enabled
I20250624 19:57:42.757655  3367 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 19:57:42.759436  3367 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 19:57:42.777113  3373 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 19:57:43.390202  3363 heartbeater.cc:499] Master 127.3.15.126:39801 was elected leader, sending a full tablet report...
W20250624 19:57:42.777138  3374 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 19:57:42.781100  3376 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 19:57:43.994292  3375 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Connection time-out
I20250624 19:57:43.994338  3367 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
I20250624 19:57:43.998929  3367 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 19:57:44.001391  3367 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 19:57:44.002805  3367 hybrid_clock.cc:648] HybridClock initialized: now 1750795064002749 us; error 66 us; skew 500 ppm
I20250624 19:57:44.003798  3367 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 19:57:44.011297  3367 webserver.cc:469] Webserver started at http://127.3.15.66:42695/ using document root <none> and password file <none>
I20250624 19:57:44.012355  3367 fs_manager.cc:362] Metadata directory not provided
I20250624 19:57:44.012676  3367 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 19:57:44.013242  3367 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 19:57:44.018041  3367 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/data/instance:
uuid: "86cdccf3123144cd946ced41749f1c43"
format_stamp: "Formatted at 2025-06-24 19:57:44 on dist-test-slave-0t1p"
I20250624 19:57:44.019302  3367 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/wal/instance:
uuid: "86cdccf3123144cd946ced41749f1c43"
format_stamp: "Formatted at 2025-06-24 19:57:44 on dist-test-slave-0t1p"
I20250624 19:57:44.027897  3367 fs_manager.cc:696] Time spent creating directory manager: real 0.008s	user 0.005s	sys 0.004s
I20250624 19:57:44.034391  3383 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20250624 19:57:44.035846  3367 fs_manager.cc:730] Time spent opening block manager: real 0.004s	user 0.005s	sys 0.000s
I20250624 19:57:44.036259  3367 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/data,/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/wal
uuid: "86cdccf3123144cd946ced41749f1c43"
format_stamp: "Formatted at 2025-06-24 19:57:44 on dist-test-slave-0t1p"
I20250624 19:57:44.036710  3367 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 19:57:44.093041  3367 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20250624 19:57:44.095148  3367 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 19:57:44.095686  3367 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 19:57:44.098589  3367 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 19:57:44.103801  3367 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 19:57:44.104053  3367 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20250624 19:57:44.104307  3367 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 19:57:44.104480  3367 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20250624 19:57:44.284495  3367 rpc_server.cc:307] RPC server started. Bound to: 127.3.15.66:35061
I20250624 19:57:44.284706  3495 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.3.15.66:35061 every 8 connection(s)
I20250624 19:57:44.287259  3367 server_base.cc:1180] Dumped server information to /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/data/info.pb
I20250624 19:57:44.292852  3133 external_mini_cluster.cc:1413] Started /tmp/dist-test-task37W5hK/build/tsan/bin/kudu as pid 3367
I20250624 19:57:44.293403  3133 external_mini_cluster.cc:1427] Reading /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/wal/instance
I20250624 19:57:44.313180  3496 heartbeater.cc:344] Connected to a master server at 127.3.15.126:39801
I20250624 19:57:44.313779  3496 heartbeater.cc:461] Registering TS with master...
I20250624 19:57:44.315327  3496 heartbeater.cc:507] Master 127.3.15.126:39801 requested a full tablet report, sending...
I20250624 19:57:44.317802  3174 ts_manager.cc:194] Registered new tserver with Master: 86cdccf3123144cd946ced41749f1c43 (127.3.15.66:35061)
I20250624 19:57:44.319200  3174 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.3.15.66:43321
I20250624 19:57:44.330494  3133 external_mini_cluster.cc:934] 2 TS(s) registered with all masters
I20250624 19:57:44.370496  3133 external_mini_cluster.cc:1620] Killing /tmp/dist-test-task37W5hK/build/tsan/bin/kudu with pid 3367
I20250624 19:57:44.400650  3133 external_mini_cluster.cc:1620] Killing /tmp/dist-test-task37W5hK/build/tsan/bin/kudu with pid 3142
I20250624 19:57:44.436501  3133 external_mini_cluster.cc:1351] Running /tmp/dist-test-task37W5hK/build/tsan/bin/kudu
/tmp/dist-test-task37W5hK/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.3.15.126:39801
--webserver_interface=127.3.15.126
--webserver_port=34655
--builtin_ntp_servers=127.3.15.84:38311
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.3.15.126:39801
--master_tombstone_evicted_tablet_replicas=false with env {}
W20250624 19:57:45.405330  3363 heartbeater.cc:646] Failed to heartbeat to 127.3.15.126:39801 (0 consecutive failures): Network error: Failed to send heartbeat to master: Client connection negotiation failed: client connection to 127.3.15.126:39801: connect: Connection refused (error 111)
W20250624 19:57:46.094007  3507 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 19:57:46.094656  3507 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 19:57:46.095117  3507 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 19:57:46.129917  3507 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20250624 19:57:46.130249  3507 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 19:57:46.130478  3507 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20250624 19:57:46.130687  3507 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20250624 19:57:46.170169  3507 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.3.15.84:38311
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/wal
--master_tombstone_evicted_tablet_replicas=false
--ipki_ca_key_size=768
--master_addresses=127.3.15.126:39801
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.3.15.126:39801
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.3.15.126
--webserver_port=34655
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 19:43:22 UTC on 5fd53c4cbb9d
build id 6753
TSAN enabled
I20250624 19:57:46.171608  3507 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 19:57:46.174378  3507 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 19:57:46.195480  3513 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 19:57:46.195652  3514 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 19:57:46.195533  3516 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 19:57:46.200745  3507 server_base.cc:1048] running on GCE node
I20250624 19:57:47.470218  3507 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 19:57:47.473604  3507 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 19:57:47.475103  3507 hybrid_clock.cc:648] HybridClock initialized: now 1750795067475065 us; error 47 us; skew 500 ppm
I20250624 19:57:47.476032  3507 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 19:57:47.488898  3507 webserver.cc:469] Webserver started at http://127.3.15.126:34655/ using document root <none> and password file <none>
I20250624 19:57:47.490105  3507 fs_manager.cc:362] Metadata directory not provided
I20250624 19:57:47.490391  3507 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 19:57:47.499439  3507 fs_manager.cc:714] Time spent opening directory manager: real 0.006s	user 0.005s	sys 0.000s
I20250624 19:57:47.505831  3525 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20250624 19:57:47.507246  3507 fs_manager.cc:730] Time spent opening block manager: real 0.005s	user 0.002s	sys 0.003s
I20250624 19:57:47.507689  3507 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/data,/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/wal
uuid: "b198b29095774ed493443570ccef15b7"
format_stamp: "Formatted at 2025-06-24 19:57:39 on dist-test-slave-0t1p"
I20250624 19:57:47.509939  3507 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 19:57:47.570999  3507 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20250624 19:57:47.572804  3507 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 19:57:47.573346  3507 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 19:57:47.656420  3507 rpc_server.cc:307] RPC server started. Bound to: 127.3.15.126:39801
I20250624 19:57:47.656602  3576 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.3.15.126:39801 every 8 connection(s)
I20250624 19:57:47.659770  3507 server_base.cc:1180] Dumped server information to /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/master-0/data/info.pb
I20250624 19:57:47.660631  3133 external_mini_cluster.cc:1413] Started /tmp/dist-test-task37W5hK/build/tsan/bin/kudu as pid 3507
I20250624 19:57:47.676391  3577 sys_catalog.cc:263] Verifying existing consensus state
I20250624 19:57:47.682138  3577 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7: Bootstrap starting.
I20250624 19:57:47.727581  3577 log.cc:826] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7: Log is configured to *not* fsync() on all Append() calls
I20250624 19:57:47.746464  3577 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7: Bootstrap replayed 1/1 log segments. Stats: ops{read=4 overwritten=0 applied=4 ignored=0} inserts{seen=3 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 19:57:47.747393  3577 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7: Bootstrap complete.
I20250624 19:57:47.770372  3577 raft_consensus.cc:357] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "b198b29095774ed493443570ccef15b7" member_type: VOTER last_known_addr { host: "127.3.15.126" port: 39801 } }
I20250624 19:57:47.772943  3577 raft_consensus.cc:738] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: b198b29095774ed493443570ccef15b7, State: Initialized, Role: FOLLOWER
I20250624 19:57:47.773800  3577 consensus_queue.cc:260] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 4, Last appended: 1.4, Last appended by leader: 4, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "b198b29095774ed493443570ccef15b7" member_type: VOTER last_known_addr { host: "127.3.15.126" port: 39801 } }
I20250624 19:57:47.774425  3577 raft_consensus.cc:397] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [term 1 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 19:57:47.774749  3577 raft_consensus.cc:491] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [term 1 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 19:57:47.775103  3577 raft_consensus.cc:3058] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [term 1 FOLLOWER]: Advancing to term 2
I20250624 19:57:47.783125  3577 raft_consensus.cc:513] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "b198b29095774ed493443570ccef15b7" member_type: VOTER last_known_addr { host: "127.3.15.126" port: 39801 } }
I20250624 19:57:47.783946  3577 leader_election.cc:304] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: b198b29095774ed493443570ccef15b7; no voters: 
I20250624 19:57:47.785828  3577 leader_election.cc:290] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [CANDIDATE]: Term 2 election: Requested vote from peers 
I20250624 19:57:47.786358  3582 raft_consensus.cc:2802] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [term 2 FOLLOWER]: Leader election won for term 2
I20250624 19:57:47.789115  3582 raft_consensus.cc:695] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [term 2 LEADER]: Becoming Leader. State: Replica: b198b29095774ed493443570ccef15b7, State: Running, Role: LEADER
I20250624 19:57:47.790024  3582 consensus_queue.cc:237] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 4, Committed index: 4, Last appended: 1.4, Last appended by leader: 4, Current term: 2, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "b198b29095774ed493443570ccef15b7" member_type: VOTER last_known_addr { host: "127.3.15.126" port: 39801 } }
I20250624 19:57:47.790987  3577 sys_catalog.cc:564] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [sys.catalog]: configured and running, proceeding with master startup.
I20250624 19:57:47.803565  3584 sys_catalog.cc:455] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [sys.catalog]: SysCatalogTable state changed. Reason: New leader b198b29095774ed493443570ccef15b7. Latest consensus state: current_term: 2 leader_uuid: "b198b29095774ed493443570ccef15b7" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "b198b29095774ed493443570ccef15b7" member_type: VOTER last_known_addr { host: "127.3.15.126" port: 39801 } } }
I20250624 19:57:47.805387  3584 sys_catalog.cc:458] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [sys.catalog]: This master's current role is: LEADER
I20250624 19:57:47.803778  3583 sys_catalog.cc:455] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 2 leader_uuid: "b198b29095774ed493443570ccef15b7" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "b198b29095774ed493443570ccef15b7" member_type: VOTER last_known_addr { host: "127.3.15.126" port: 39801 } } }
I20250624 19:57:47.806703  3583 sys_catalog.cc:458] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7 [sys.catalog]: This master's current role is: LEADER
I20250624 19:57:47.814059  3591 catalog_manager.cc:1477] Loading table and tablet metadata into memory...
I20250624 19:57:47.832104  3591 catalog_manager.cc:1486] Initializing Kudu cluster ID...
I20250624 19:57:47.837491  3591 catalog_manager.cc:1261] Loaded cluster ID: 7c01f129cc9a4d4ab9975fb6353d4e6b
I20250624 19:57:47.837793  3591 catalog_manager.cc:1497] Initializing Kudu internal certificate authority...
I20250624 19:57:47.844209  3591 catalog_manager.cc:1506] Loading token signing keys...
I20250624 19:57:47.847853  3591 catalog_manager.cc:5966] T 00000000000000000000000000000000 P b198b29095774ed493443570ccef15b7: Loaded TSK: 0
I20250624 19:57:47.849337  3591 catalog_manager.cc:1516] Initializing in-progress tserver states...
I20250624 19:57:48.431118  3363 heartbeater.cc:344] Connected to a master server at 127.3.15.126:39801
I20250624 19:57:48.437390  3541 master_service.cc:432] Got heartbeat from unknown tserver (permanent_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" instance_seqno: 1750795062295825) as {username='slave'} at 127.3.15.65:40247; Asking this server to re-register.
I20250624 19:57:48.440408  3363 heartbeater.cc:461] Registering TS with master...
I20250624 19:57:48.441222  3363 heartbeater.cc:507] Master 127.3.15.126:39801 requested a full tablet report, sending...
I20250624 19:57:48.444218  3541 ts_manager.cc:194] Registered new tserver with Master: c9ff5371a1a14662a1a2fec8dd43e81c (127.3.15.65:37511)
I20250624 19:57:48.450382  3133 external_mini_cluster.cc:934] 1 TS(s) registered with all masters
I20250624 19:57:48.451124  3133 test_util.cc:276] Using random seed: -1105123050
I20250624 19:57:48.514503  3541 catalog_manager.cc:2232] Servicing CreateTable request from {username='slave'} at 127.0.0.1:52568:
name: "test-workload"
schema {
  columns {
    name: "key"
    type: INT32
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "int_val"
    type: INT32
    is_key: false
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "string_val"
    type: STRING
    is_key: false
    is_nullable: true
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 1
split_rows_range_bounds {
  rows: "<redacted>""\004\001\000\377\377\377\037\004\001\000\376\377\377?\004\001\000\375\377\377_"
  indirect_data: "<redacted>"""
}
partition_schema {
  range_schema {
    columns {
      name: "key"
    }
  }
}
I20250624 19:57:48.580088  3298 tablet_service.cc:1468] Processing CreateTablet for tablet baea20be0c8b42508f2ad3a5e202d856 (DEFAULT_TABLE table=test-workload [id=158e4844f50e49a7903f56b3e1e3a13b]), partition=RANGE (key) PARTITION VALUES < 536870911
I20250624 19:57:48.580284  3295 tablet_service.cc:1468] Processing CreateTablet for tablet 5cee3955877944b4916a292f0063c92f (DEFAULT_TABLE table=test-workload [id=158e4844f50e49a7903f56b3e1e3a13b]), partition=RANGE (key) PARTITION 1610612733 <= VALUES
I20250624 19:57:48.580662  3296 tablet_service.cc:1468] Processing CreateTablet for tablet 72e6ce12b5de4513b71381caf90211ce (DEFAULT_TABLE table=test-workload [id=158e4844f50e49a7903f56b3e1e3a13b]), partition=RANGE (key) PARTITION 1073741822 <= VALUES < 1610612733
I20250624 19:57:48.580116  3297 tablet_service.cc:1468] Processing CreateTablet for tablet 1b4497262bd14982b89dc3803b858391 (DEFAULT_TABLE table=test-workload [id=158e4844f50e49a7903f56b3e1e3a13b]), partition=RANGE (key) PARTITION 536870911 <= VALUES < 1073741822
I20250624 19:57:48.584010  3295 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 5cee3955877944b4916a292f0063c92f. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 19:57:48.585018  3297 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 1b4497262bd14982b89dc3803b858391. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 19:57:48.585912  3296 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 72e6ce12b5de4513b71381caf90211ce. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 19:57:48.586607  3298 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet baea20be0c8b42508f2ad3a5e202d856. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 19:57:48.625624  3613 tablet_bootstrap.cc:492] T 5cee3955877944b4916a292f0063c92f P c9ff5371a1a14662a1a2fec8dd43e81c: Bootstrap starting.
I20250624 19:57:48.632604  3613 tablet_bootstrap.cc:654] T 5cee3955877944b4916a292f0063c92f P c9ff5371a1a14662a1a2fec8dd43e81c: Neither blocks nor log segments found. Creating new log.
I20250624 19:57:48.634654  3613 log.cc:826] T 5cee3955877944b4916a292f0063c92f P c9ff5371a1a14662a1a2fec8dd43e81c: Log is configured to *not* fsync() on all Append() calls
I20250624 19:57:48.640933  3613 tablet_bootstrap.cc:492] T 5cee3955877944b4916a292f0063c92f P c9ff5371a1a14662a1a2fec8dd43e81c: No bootstrap required, opened a new log
I20250624 19:57:48.641484  3613 ts_tablet_manager.cc:1397] T 5cee3955877944b4916a292f0063c92f P c9ff5371a1a14662a1a2fec8dd43e81c: Time spent bootstrapping tablet: real 0.017s	user 0.012s	sys 0.002s
I20250624 19:57:48.669590  3613 raft_consensus.cc:357] T 5cee3955877944b4916a292f0063c92f P c9ff5371a1a14662a1a2fec8dd43e81c [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 37511 } }
I20250624 19:57:48.670616  3613 raft_consensus.cc:383] T 5cee3955877944b4916a292f0063c92f P c9ff5371a1a14662a1a2fec8dd43e81c [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 19:57:48.671090  3613 raft_consensus.cc:738] T 5cee3955877944b4916a292f0063c92f P c9ff5371a1a14662a1a2fec8dd43e81c [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: c9ff5371a1a14662a1a2fec8dd43e81c, State: Initialized, Role: FOLLOWER
I20250624 19:57:48.672268  3613 consensus_queue.cc:260] T 5cee3955877944b4916a292f0063c92f P c9ff5371a1a14662a1a2fec8dd43e81c [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 37511 } }
I20250624 19:57:48.673198  3613 raft_consensus.cc:397] T 5cee3955877944b4916a292f0063c92f P c9ff5371a1a14662a1a2fec8dd43e81c [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 19:57:48.673820  3613 raft_consensus.cc:491] T 5cee3955877944b4916a292f0063c92f P c9ff5371a1a14662a1a2fec8dd43e81c [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 19:57:48.674274  3613 raft_consensus.cc:3058] T 5cee3955877944b4916a292f0063c92f P c9ff5371a1a14662a1a2fec8dd43e81c [term 0 FOLLOWER]: Advancing to term 1
I20250624 19:57:48.682137  3613 raft_consensus.cc:513] T 5cee3955877944b4916a292f0063c92f P c9ff5371a1a14662a1a2fec8dd43e81c [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 37511 } }
I20250624 19:57:48.683303  3613 leader_election.cc:304] T 5cee3955877944b4916a292f0063c92f P c9ff5371a1a14662a1a2fec8dd43e81c [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: c9ff5371a1a14662a1a2fec8dd43e81c; no voters: 
I20250624 19:57:48.687412  3613 leader_election.cc:290] T 5cee3955877944b4916a292f0063c92f P c9ff5371a1a14662a1a2fec8dd43e81c [CANDIDATE]: Term 1 election: Requested vote from peers 
I20250624 19:57:48.687919  3615 raft_consensus.cc:2802] T 5cee3955877944b4916a292f0063c92f P c9ff5371a1a14662a1a2fec8dd43e81c [term 1 FOLLOWER]: Leader election won for term 1
I20250624 19:57:48.700963  3613 ts_tablet_manager.cc:1428] T 5cee3955877944b4916a292f0063c92f P c9ff5371a1a14662a1a2fec8dd43e81c: Time spent starting tablet: real 0.059s	user 0.051s	sys 0.008s
I20250624 19:57:48.701040  3615 raft_consensus.cc:695] T 5cee3955877944b4916a292f0063c92f P c9ff5371a1a14662a1a2fec8dd43e81c [term 1 LEADER]: Becoming Leader. State: Replica: c9ff5371a1a14662a1a2fec8dd43e81c, State: Running, Role: LEADER
I20250624 19:57:48.702234  3613 tablet_bootstrap.cc:492] T 72e6ce12b5de4513b71381caf90211ce P c9ff5371a1a14662a1a2fec8dd43e81c: Bootstrap starting.
I20250624 19:57:48.702980  3615 consensus_queue.cc:237] T 5cee3955877944b4916a292f0063c92f P c9ff5371a1a14662a1a2fec8dd43e81c [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 37511 } }
I20250624 19:57:48.709188  3613 tablet_bootstrap.cc:654] T 72e6ce12b5de4513b71381caf90211ce P c9ff5371a1a14662a1a2fec8dd43e81c: Neither blocks nor log segments found. Creating new log.
I20250624 19:57:48.722167  3613 tablet_bootstrap.cc:492] T 72e6ce12b5de4513b71381caf90211ce P c9ff5371a1a14662a1a2fec8dd43e81c: No bootstrap required, opened a new log
I20250624 19:57:48.722844  3613 ts_tablet_manager.cc:1397] T 72e6ce12b5de4513b71381caf90211ce P c9ff5371a1a14662a1a2fec8dd43e81c: Time spent bootstrapping tablet: real 0.021s	user 0.011s	sys 0.008s
I20250624 19:57:48.726222  3613 raft_consensus.cc:357] T 72e6ce12b5de4513b71381caf90211ce P c9ff5371a1a14662a1a2fec8dd43e81c [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 37511 } }
I20250624 19:57:48.727064  3613 raft_consensus.cc:383] T 72e6ce12b5de4513b71381caf90211ce P c9ff5371a1a14662a1a2fec8dd43e81c [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 19:57:48.727443  3613 raft_consensus.cc:738] T 72e6ce12b5de4513b71381caf90211ce P c9ff5371a1a14662a1a2fec8dd43e81c [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: c9ff5371a1a14662a1a2fec8dd43e81c, State: Initialized, Role: FOLLOWER
I20250624 19:57:48.728346  3613 consensus_queue.cc:260] T 72e6ce12b5de4513b71381caf90211ce P c9ff5371a1a14662a1a2fec8dd43e81c [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 37511 } }
I20250624 19:57:48.729270  3613 raft_consensus.cc:397] T 72e6ce12b5de4513b71381caf90211ce P c9ff5371a1a14662a1a2fec8dd43e81c [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 19:57:48.729642  3613 raft_consensus.cc:491] T 72e6ce12b5de4513b71381caf90211ce P c9ff5371a1a14662a1a2fec8dd43e81c [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 19:57:48.730093  3613 raft_consensus.cc:3058] T 72e6ce12b5de4513b71381caf90211ce P c9ff5371a1a14662a1a2fec8dd43e81c [term 0 FOLLOWER]: Advancing to term 1
I20250624 19:57:48.737764  3613 raft_consensus.cc:513] T 72e6ce12b5de4513b71381caf90211ce P c9ff5371a1a14662a1a2fec8dd43e81c [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 37511 } }
I20250624 19:57:48.738636  3613 leader_election.cc:304] T 72e6ce12b5de4513b71381caf90211ce P c9ff5371a1a14662a1a2fec8dd43e81c [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: c9ff5371a1a14662a1a2fec8dd43e81c; no voters: 
I20250624 19:57:48.739369  3613 leader_election.cc:290] T 72e6ce12b5de4513b71381caf90211ce P c9ff5371a1a14662a1a2fec8dd43e81c [CANDIDATE]: Term 1 election: Requested vote from peers 
I20250624 19:57:48.739671  3616 raft_consensus.cc:2802] T 72e6ce12b5de4513b71381caf90211ce P c9ff5371a1a14662a1a2fec8dd43e81c [term 1 FOLLOWER]: Leader election won for term 1
I20250624 19:57:48.741201  3541 catalog_manager.cc:5582] T 5cee3955877944b4916a292f0063c92f P c9ff5371a1a14662a1a2fec8dd43e81c reported cstate change: term changed from 0 to 1, leader changed from <none> to c9ff5371a1a14662a1a2fec8dd43e81c (127.3.15.65). New cstate: current_term: 1 leader_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 37511 } health_report { overall_health: HEALTHY } } }
I20250624 19:57:48.745108  3616 raft_consensus.cc:695] T 72e6ce12b5de4513b71381caf90211ce P c9ff5371a1a14662a1a2fec8dd43e81c [term 1 LEADER]: Becoming Leader. State: Replica: c9ff5371a1a14662a1a2fec8dd43e81c, State: Running, Role: LEADER
I20250624 19:57:48.745952  3616 consensus_queue.cc:237] T 72e6ce12b5de4513b71381caf90211ce P c9ff5371a1a14662a1a2fec8dd43e81c [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 37511 } }
I20250624 19:57:48.765666  3613 ts_tablet_manager.cc:1428] T 72e6ce12b5de4513b71381caf90211ce P c9ff5371a1a14662a1a2fec8dd43e81c: Time spent starting tablet: real 0.042s	user 0.017s	sys 0.007s
I20250624 19:57:48.766827  3613 tablet_bootstrap.cc:492] T baea20be0c8b42508f2ad3a5e202d856 P c9ff5371a1a14662a1a2fec8dd43e81c: Bootstrap starting.
I20250624 19:57:48.770445  3541 catalog_manager.cc:5582] T 72e6ce12b5de4513b71381caf90211ce P c9ff5371a1a14662a1a2fec8dd43e81c reported cstate change: term changed from 0 to 1, leader changed from <none> to c9ff5371a1a14662a1a2fec8dd43e81c (127.3.15.65). New cstate: current_term: 1 leader_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 37511 } health_report { overall_health: HEALTHY } } }
I20250624 19:57:48.773411  3613 tablet_bootstrap.cc:654] T baea20be0c8b42508f2ad3a5e202d856 P c9ff5371a1a14662a1a2fec8dd43e81c: Neither blocks nor log segments found. Creating new log.
I20250624 19:57:48.785677  3613 tablet_bootstrap.cc:492] T baea20be0c8b42508f2ad3a5e202d856 P c9ff5371a1a14662a1a2fec8dd43e81c: No bootstrap required, opened a new log
I20250624 19:57:48.786264  3613 ts_tablet_manager.cc:1397] T baea20be0c8b42508f2ad3a5e202d856 P c9ff5371a1a14662a1a2fec8dd43e81c: Time spent bootstrapping tablet: real 0.020s	user 0.005s	sys 0.012s
I20250624 19:57:48.789059  3613 raft_consensus.cc:357] T baea20be0c8b42508f2ad3a5e202d856 P c9ff5371a1a14662a1a2fec8dd43e81c [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 37511 } }
I20250624 19:57:48.789920  3613 raft_consensus.cc:383] T baea20be0c8b42508f2ad3a5e202d856 P c9ff5371a1a14662a1a2fec8dd43e81c [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 19:57:48.790516  3613 raft_consensus.cc:738] T baea20be0c8b42508f2ad3a5e202d856 P c9ff5371a1a14662a1a2fec8dd43e81c [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: c9ff5371a1a14662a1a2fec8dd43e81c, State: Initialized, Role: FOLLOWER
I20250624 19:57:48.791610  3613 consensus_queue.cc:260] T baea20be0c8b42508f2ad3a5e202d856 P c9ff5371a1a14662a1a2fec8dd43e81c [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 37511 } }
I20250624 19:57:48.792393  3613 raft_consensus.cc:397] T baea20be0c8b42508f2ad3a5e202d856 P c9ff5371a1a14662a1a2fec8dd43e81c [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 19:57:48.792819  3613 raft_consensus.cc:491] T baea20be0c8b42508f2ad3a5e202d856 P c9ff5371a1a14662a1a2fec8dd43e81c [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 19:57:48.793239  3613 raft_consensus.cc:3058] T baea20be0c8b42508f2ad3a5e202d856 P c9ff5371a1a14662a1a2fec8dd43e81c [term 0 FOLLOWER]: Advancing to term 1
I20250624 19:57:48.800714  3613 raft_consensus.cc:513] T baea20be0c8b42508f2ad3a5e202d856 P c9ff5371a1a14662a1a2fec8dd43e81c [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 37511 } }
I20250624 19:57:48.801749  3613 leader_election.cc:304] T baea20be0c8b42508f2ad3a5e202d856 P c9ff5371a1a14662a1a2fec8dd43e81c [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: c9ff5371a1a14662a1a2fec8dd43e81c; no voters: 
I20250624 19:57:48.802384  3613 leader_election.cc:290] T baea20be0c8b42508f2ad3a5e202d856 P c9ff5371a1a14662a1a2fec8dd43e81c [CANDIDATE]: Term 1 election: Requested vote from peers 
I20250624 19:57:48.802680  3617 raft_consensus.cc:2802] T baea20be0c8b42508f2ad3a5e202d856 P c9ff5371a1a14662a1a2fec8dd43e81c [term 1 FOLLOWER]: Leader election won for term 1
I20250624 19:57:48.803782  3617 raft_consensus.cc:695] T baea20be0c8b42508f2ad3a5e202d856 P c9ff5371a1a14662a1a2fec8dd43e81c [term 1 LEADER]: Becoming Leader. State: Replica: c9ff5371a1a14662a1a2fec8dd43e81c, State: Running, Role: LEADER
I20250624 19:57:48.804107  3613 ts_tablet_manager.cc:1428] T baea20be0c8b42508f2ad3a5e202d856 P c9ff5371a1a14662a1a2fec8dd43e81c: Time spent starting tablet: real 0.017s	user 0.014s	sys 0.000s
I20250624 19:57:48.804555  3617 consensus_queue.cc:237] T baea20be0c8b42508f2ad3a5e202d856 P c9ff5371a1a14662a1a2fec8dd43e81c [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 37511 } }
I20250624 19:57:48.805055  3613 tablet_bootstrap.cc:492] T 1b4497262bd14982b89dc3803b858391 P c9ff5371a1a14662a1a2fec8dd43e81c: Bootstrap starting.
I20250624 19:57:48.811902  3613 tablet_bootstrap.cc:654] T 1b4497262bd14982b89dc3803b858391 P c9ff5371a1a14662a1a2fec8dd43e81c: Neither blocks nor log segments found. Creating new log.
I20250624 19:57:48.821250  3542 catalog_manager.cc:5582] T baea20be0c8b42508f2ad3a5e202d856 P c9ff5371a1a14662a1a2fec8dd43e81c reported cstate change: term changed from 0 to 1, leader changed from <none> to c9ff5371a1a14662a1a2fec8dd43e81c (127.3.15.65). New cstate: current_term: 1 leader_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 37511 } health_report { overall_health: HEALTHY } } }
I20250624 19:57:48.822254  3613 tablet_bootstrap.cc:492] T 1b4497262bd14982b89dc3803b858391 P c9ff5371a1a14662a1a2fec8dd43e81c: No bootstrap required, opened a new log
I20250624 19:57:48.822647  3613 ts_tablet_manager.cc:1397] T 1b4497262bd14982b89dc3803b858391 P c9ff5371a1a14662a1a2fec8dd43e81c: Time spent bootstrapping tablet: real 0.018s	user 0.011s	sys 0.004s
I20250624 19:57:48.824935  3613 raft_consensus.cc:357] T 1b4497262bd14982b89dc3803b858391 P c9ff5371a1a14662a1a2fec8dd43e81c [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 37511 } }
I20250624 19:57:48.825542  3613 raft_consensus.cc:383] T 1b4497262bd14982b89dc3803b858391 P c9ff5371a1a14662a1a2fec8dd43e81c [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 19:57:48.825835  3613 raft_consensus.cc:738] T 1b4497262bd14982b89dc3803b858391 P c9ff5371a1a14662a1a2fec8dd43e81c [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: c9ff5371a1a14662a1a2fec8dd43e81c, State: Initialized, Role: FOLLOWER
I20250624 19:57:48.826507  3613 consensus_queue.cc:260] T 1b4497262bd14982b89dc3803b858391 P c9ff5371a1a14662a1a2fec8dd43e81c [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 37511 } }
I20250624 19:57:48.827015  3613 raft_consensus.cc:397] T 1b4497262bd14982b89dc3803b858391 P c9ff5371a1a14662a1a2fec8dd43e81c [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 19:57:48.827272  3613 raft_consensus.cc:491] T 1b4497262bd14982b89dc3803b858391 P c9ff5371a1a14662a1a2fec8dd43e81c [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 19:57:48.827616  3613 raft_consensus.cc:3058] T 1b4497262bd14982b89dc3803b858391 P c9ff5371a1a14662a1a2fec8dd43e81c [term 0 FOLLOWER]: Advancing to term 1
I20250624 19:57:48.832736  3613 raft_consensus.cc:513] T 1b4497262bd14982b89dc3803b858391 P c9ff5371a1a14662a1a2fec8dd43e81c [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 37511 } }
I20250624 19:57:48.833571  3613 leader_election.cc:304] T 1b4497262bd14982b89dc3803b858391 P c9ff5371a1a14662a1a2fec8dd43e81c [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: c9ff5371a1a14662a1a2fec8dd43e81c; no voters: 
I20250624 19:57:48.834228  3613 leader_election.cc:290] T 1b4497262bd14982b89dc3803b858391 P c9ff5371a1a14662a1a2fec8dd43e81c [CANDIDATE]: Term 1 election: Requested vote from peers 
I20250624 19:57:48.834393  3617 raft_consensus.cc:2802] T 1b4497262bd14982b89dc3803b858391 P c9ff5371a1a14662a1a2fec8dd43e81c [term 1 FOLLOWER]: Leader election won for term 1
I20250624 19:57:48.834893  3617 raft_consensus.cc:695] T 1b4497262bd14982b89dc3803b858391 P c9ff5371a1a14662a1a2fec8dd43e81c [term 1 LEADER]: Becoming Leader. State: Replica: c9ff5371a1a14662a1a2fec8dd43e81c, State: Running, Role: LEADER
I20250624 19:57:48.835584  3617 consensus_queue.cc:237] T 1b4497262bd14982b89dc3803b858391 P c9ff5371a1a14662a1a2fec8dd43e81c [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 37511 } }
I20250624 19:57:48.836388  3613 ts_tablet_manager.cc:1428] T 1b4497262bd14982b89dc3803b858391 P c9ff5371a1a14662a1a2fec8dd43e81c: Time spent starting tablet: real 0.013s	user 0.009s	sys 0.004s
I20250624 19:57:48.843865  3542 catalog_manager.cc:5582] T 1b4497262bd14982b89dc3803b858391 P c9ff5371a1a14662a1a2fec8dd43e81c reported cstate change: term changed from 0 to 1, leader changed from <none> to c9ff5371a1a14662a1a2fec8dd43e81c (127.3.15.65). New cstate: current_term: 1 leader_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 37511 } health_report { overall_health: HEALTHY } } }
W20250624 19:57:49.640985  3630 meta_cache.cc:1261] Time spent looking up entry by key: real 0.058s	user 0.002s	sys 0.000s
W20250624 19:57:50.390799  3629 meta_cache.cc:1261] Time spent looking up entry by key: real 0.062s	user 0.004s	sys 0.000s
W20250624 19:57:53.721263  3631 meta_cache.cc:1261] Time spent looking up entry by key: real 0.073s	user 0.011s	sys 0.020s
W20250624 19:57:53.721251  3635 meta_cache.cc:1261] Time spent looking up entry by key: real 0.075s	user 0.000s	sys 0.029s
W20250624 19:57:53.728749  3632 meta_cache.cc:1261] Time spent looking up entry by key: real 0.077s	user 0.004s	sys 0.000s
W20250624 19:57:53.886125  3634 meta_cache.cc:1261] Time spent looking up entry by key: real 0.241s	user 0.001s	sys 0.004s
W20250624 19:57:53.721455  3628 meta_cache.cc:1261] Time spent looking up entry by key: real 0.073s	user 0.007s	sys 0.027s
I20250624 19:57:54.264325  3133 external_mini_cluster.cc:1351] Running /tmp/dist-test-task37W5hK/build/tsan/bin/kudu
/tmp/dist-test-task37W5hK/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.3.15.66:35061
--local_ip_for_outbound_sockets=127.3.15.66
--tserver_master_addrs=127.3.15.126:39801
--webserver_port=42695
--webserver_interface=127.3.15.66
--builtin_ntp_servers=127.3.15.84:38311
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--num_tablets_to_copy_simultaneously=1 with env {}
W20250624 19:57:54.626937  3652 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 19:57:54.627534  3652 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 19:57:54.628085  3652 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 19:57:54.668661  3652 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 19:57:54.670105  3652 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.3.15.66
I20250624 19:57:54.711660  3652 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.3.15.84:38311
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.3.15.66:35061
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.3.15.66
--webserver_port=42695
--tserver_master_addrs=127.3.15.126:39801
--num_tablets_to_copy_simultaneously=1
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.3.15.66
--log_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 19:43:22 UTC on 5fd53c4cbb9d
build id 6753
TSAN enabled
I20250624 19:57:54.713121  3652 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 19:57:54.714941  3652 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 19:57:54.734833  3659 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 19:57:54.735814  3661 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 19:57:54.738013  3652 server_base.cc:1048] running on GCE node
W20250624 19:57:54.735260  3658 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 19:57:55.966256  3652 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 19:57:55.969249  3652 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 19:57:55.970777  3652 hybrid_clock.cc:648] HybridClock initialized: now 1750795075970723 us; error 70 us; skew 500 ppm
I20250624 19:57:55.971712  3652 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 19:57:55.979725  3652 webserver.cc:469] Webserver started at http://127.3.15.66:42695/ using document root <none> and password file <none>
I20250624 19:57:55.980847  3652 fs_manager.cc:362] Metadata directory not provided
I20250624 19:57:55.981112  3652 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 19:57:55.990445  3652 fs_manager.cc:714] Time spent opening directory manager: real 0.006s	user 0.004s	sys 0.004s
I20250624 19:57:55.996282  3668 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20250624 19:57:55.997932  3652 fs_manager.cc:730] Time spent opening block manager: real 0.005s	user 0.003s	sys 0.000s
I20250624 19:57:55.998440  3652 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/data,/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/wal
uuid: "86cdccf3123144cd946ced41749f1c43"
format_stamp: "Formatted at 2025-06-24 19:57:44 on dist-test-slave-0t1p"
I20250624 19:57:56.000663  3652 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 19:57:56.064754  3652 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20250624 19:57:56.066416  3652 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 19:57:56.066932  3652 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 19:57:56.069660  3652 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 19:57:56.074487  3652 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 19:57:56.074744  3652 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20250624 19:57:56.075078  3652 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 19:57:56.075241  3652 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20250624 19:57:56.239866  3652 rpc_server.cc:307] RPC server started. Bound to: 127.3.15.66:35061
I20250624 19:57:56.240563  3781 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.3.15.66:35061 every 8 connection(s)
I20250624 19:57:56.243289  3652 server_base.cc:1180] Dumped server information to /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750795058047324-3133-0/minicluster-data/ts-1/data/info.pb
I20250624 19:57:56.250267  3133 external_mini_cluster.cc:1413] Started /tmp/dist-test-task37W5hK/build/tsan/bin/kudu as pid 3652
I20250624 19:57:56.274636  3782 heartbeater.cc:344] Connected to a master server at 127.3.15.126:39801
I20250624 19:57:56.275171  3782 heartbeater.cc:461] Registering TS with master...
I20250624 19:57:56.277547  3788 ts_tablet_manager.cc:927] T 1b4497262bd14982b89dc3803b858391 P 86cdccf3123144cd946ced41749f1c43: Initiating tablet copy from peer c9ff5371a1a14662a1a2fec8dd43e81c (127.3.15.65:37511)
I20250624 19:57:56.277483  3782 heartbeater.cc:507] Master 127.3.15.126:39801 requested a full tablet report, sending...
I20250624 19:57:56.281589  3538 ts_manager.cc:194] Registered new tserver with Master: 86cdccf3123144cd946ced41749f1c43 (127.3.15.66:35061)
I20250624 19:57:56.282044  3788 tablet_copy_client.cc:323] T 1b4497262bd14982b89dc3803b858391 P 86cdccf3123144cd946ced41749f1c43: tablet copy: Beginning tablet copy session from remote peer at address 127.3.15.65:37511
I20250624 19:57:56.284137  3538 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.3.15.66:53875
I20250624 19:57:56.294242  3338 tablet_copy_service.cc:140] P c9ff5371a1a14662a1a2fec8dd43e81c: Received BeginTabletCopySession request for tablet 1b4497262bd14982b89dc3803b858391 from peer 86cdccf3123144cd946ced41749f1c43 ({username='slave'} at 127.3.15.66:40257)
I20250624 19:57:56.294891  3338 tablet_copy_service.cc:161] P c9ff5371a1a14662a1a2fec8dd43e81c: Beginning new tablet copy session on tablet 1b4497262bd14982b89dc3803b858391 from peer 86cdccf3123144cd946ced41749f1c43 at {username='slave'} at 127.3.15.66:40257: session id = 86cdccf3123144cd946ced41749f1c43-1b4497262bd14982b89dc3803b858391
I20250624 19:57:56.301446  3338 tablet_copy_source_session.cc:215] T 1b4497262bd14982b89dc3803b858391 P c9ff5371a1a14662a1a2fec8dd43e81c: Tablet Copy: opened 0 blocks and 1 log segments
I20250624 19:57:56.307345  3788 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 1b4497262bd14982b89dc3803b858391. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 19:57:56.328584  3788 tablet_copy_client.cc:806] T 1b4497262bd14982b89dc3803b858391 P 86cdccf3123144cd946ced41749f1c43: tablet copy: Starting download of 0 data blocks...
I20250624 19:57:56.329269  3788 tablet_copy_client.cc:670] T 1b4497262bd14982b89dc3803b858391 P 86cdccf3123144cd946ced41749f1c43: tablet copy: Starting download of 1 WAL segments...
I20250624 19:57:56.346941  3788 tablet_copy_client.cc:538] T 1b4497262bd14982b89dc3803b858391 P 86cdccf3123144cd946ced41749f1c43: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20250624 19:57:56.356319  3788 tablet_bootstrap.cc:492] T 1b4497262bd14982b89dc3803b858391 P 86cdccf3123144cd946ced41749f1c43: Bootstrap starting.
I20250624 19:57:56.526511  3788 log.cc:826] T 1b4497262bd14982b89dc3803b858391 P 86cdccf3123144cd946ced41749f1c43: Log is configured to *not* fsync() on all Append() calls
I20250624 19:57:57.288897  3782 heartbeater.cc:499] Master 127.3.15.126:39801 was elected leader, sending a full tablet report...
I20250624 19:57:57.708468  3788 tablet_bootstrap.cc:492] T 1b4497262bd14982b89dc3803b858391 P 86cdccf3123144cd946ced41749f1c43: Bootstrap replayed 1/1 log segments. Stats: ops{read=216 overwritten=0 applied=216 ignored=0} inserts{seen=2650 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 19:57:57.709448  3788 tablet_bootstrap.cc:492] T 1b4497262bd14982b89dc3803b858391 P 86cdccf3123144cd946ced41749f1c43: Bootstrap complete.
I20250624 19:57:57.710273  3788 ts_tablet_manager.cc:1397] T 1b4497262bd14982b89dc3803b858391 P 86cdccf3123144cd946ced41749f1c43: Time spent bootstrapping tablet: real 1.354s	user 1.305s	sys 0.049s
I20250624 19:57:57.725703  3788 raft_consensus.cc:357] T 1b4497262bd14982b89dc3803b858391 P 86cdccf3123144cd946ced41749f1c43 [term 1 NON_PARTICIPANT]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 37511 } }
I20250624 19:57:57.726642  3788 raft_consensus.cc:738] T 1b4497262bd14982b89dc3803b858391 P 86cdccf3123144cd946ced41749f1c43 [term 1 NON_PARTICIPANT]: Becoming Follower/Learner. State: Replica: 86cdccf3123144cd946ced41749f1c43, State: Initialized, Role: NON_PARTICIPANT
I20250624 19:57:57.727340  3788 consensus_queue.cc:260] T 1b4497262bd14982b89dc3803b858391 P 86cdccf3123144cd946ced41749f1c43 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 216, Last appended: 1.216, Last appended by leader: 216, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 37511 } }
I20250624 19:57:57.730808  3788 ts_tablet_manager.cc:1428] T 1b4497262bd14982b89dc3803b858391 P 86cdccf3123144cd946ced41749f1c43: Time spent starting tablet: real 0.020s	user 0.017s	sys 0.004s
I20250624 19:57:57.732560  3338 tablet_copy_service.cc:342] P c9ff5371a1a14662a1a2fec8dd43e81c: Request end of tablet copy session 86cdccf3123144cd946ced41749f1c43-1b4497262bd14982b89dc3803b858391 received from {username='slave'} at 127.3.15.66:40257
I20250624 19:57:57.733125  3338 tablet_copy_service.cc:434] P c9ff5371a1a14662a1a2fec8dd43e81c: ending tablet copy session 86cdccf3123144cd946ced41749f1c43-1b4497262bd14982b89dc3803b858391 on tablet 1b4497262bd14982b89dc3803b858391 with peer 86cdccf3123144cd946ced41749f1c43
I20250624 19:57:57.737805  3788 ts_tablet_manager.cc:927] T 5cee3955877944b4916a292f0063c92f P 86cdccf3123144cd946ced41749f1c43: Initiating tablet copy from peer c9ff5371a1a14662a1a2fec8dd43e81c (127.3.15.65:37511)
I20250624 19:57:57.739884  3788 tablet_copy_client.cc:323] T 5cee3955877944b4916a292f0063c92f P 86cdccf3123144cd946ced41749f1c43: tablet copy: Beginning tablet copy session from remote peer at address 127.3.15.65:37511
I20250624 19:57:57.741540  3338 tablet_copy_service.cc:140] P c9ff5371a1a14662a1a2fec8dd43e81c: Received BeginTabletCopySession request for tablet 5cee3955877944b4916a292f0063c92f from peer 86cdccf3123144cd946ced41749f1c43 ({username='slave'} at 127.3.15.66:40257)
I20250624 19:57:57.742041  3338 tablet_copy_service.cc:161] P c9ff5371a1a14662a1a2fec8dd43e81c: Beginning new tablet copy session on tablet 5cee3955877944b4916a292f0063c92f from peer 86cdccf3123144cd946ced41749f1c43 at {username='slave'} at 127.3.15.66:40257: session id = 86cdccf3123144cd946ced41749f1c43-5cee3955877944b4916a292f0063c92f
I20250624 19:57:57.747390  3338 tablet_copy_source_session.cc:215] T 5cee3955877944b4916a292f0063c92f P c9ff5371a1a14662a1a2fec8dd43e81c: Tablet Copy: opened 0 blocks and 1 log segments
I20250624 19:57:57.749943  3788 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 5cee3955877944b4916a292f0063c92f. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 19:57:57.759253  3788 tablet_copy_client.cc:806] T 5cee3955877944b4916a292f0063c92f P 86cdccf3123144cd946ced41749f1c43: tablet copy: Starting download of 0 data blocks...
I20250624 19:57:57.759698  3788 tablet_copy_client.cc:670] T 5cee3955877944b4916a292f0063c92f P 86cdccf3123144cd946ced41749f1c43: tablet copy: Starting download of 1 WAL segments...
I20250624 19:57:57.773665  3788 tablet_copy_client.cc:538] T 5cee3955877944b4916a292f0063c92f P 86cdccf3123144cd946ced41749f1c43: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20250624 19:57:57.780102  3788 tablet_bootstrap.cc:492] T 5cee3955877944b4916a292f0063c92f P 86cdccf3123144cd946ced41749f1c43: Bootstrap starting.
I20250624 19:57:59.110121  3788 tablet_bootstrap.cc:492] T 5cee3955877944b4916a292f0063c92f P 86cdccf3123144cd946ced41749f1c43: Bootstrap replayed 1/1 log segments. Stats: ops{read=216 overwritten=0 applied=216 ignored=0} inserts{seen=2719 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 19:57:59.110881  3788 tablet_bootstrap.cc:492] T 5cee3955877944b4916a292f0063c92f P 86cdccf3123144cd946ced41749f1c43: Bootstrap complete.
I20250624 19:57:59.111485  3788 ts_tablet_manager.cc:1397] T 5cee3955877944b4916a292f0063c92f P 86cdccf3123144cd946ced41749f1c43: Time spent bootstrapping tablet: real 1.332s	user 1.259s	sys 0.072s
I20250624 19:57:59.113711  3788 raft_consensus.cc:357] T 5cee3955877944b4916a292f0063c92f P 86cdccf3123144cd946ced41749f1c43 [term 1 NON_PARTICIPANT]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 37511 } }
I20250624 19:57:59.114073  3788 raft_consensus.cc:738] T 5cee3955877944b4916a292f0063c92f P 86cdccf3123144cd946ced41749f1c43 [term 1 NON_PARTICIPANT]: Becoming Follower/Learner. State: Replica: 86cdccf3123144cd946ced41749f1c43, State: Initialized, Role: NON_PARTICIPANT
I20250624 19:57:59.114595  3788 consensus_queue.cc:260] T 5cee3955877944b4916a292f0063c92f P 86cdccf3123144cd946ced41749f1c43 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 216, Last appended: 1.216, Last appended by leader: 216, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 37511 } }
I20250624 19:57:59.117285  3788 ts_tablet_manager.cc:1428] T 5cee3955877944b4916a292f0063c92f P 86cdccf3123144cd946ced41749f1c43: Time spent starting tablet: real 0.006s	user 0.008s	sys 0.000s
I20250624 19:57:59.118829  3338 tablet_copy_service.cc:342] P c9ff5371a1a14662a1a2fec8dd43e81c: Request end of tablet copy session 86cdccf3123144cd946ced41749f1c43-5cee3955877944b4916a292f0063c92f received from {username='slave'} at 127.3.15.66:40257
I20250624 19:57:59.119151  3338 tablet_copy_service.cc:434] P c9ff5371a1a14662a1a2fec8dd43e81c: ending tablet copy session 86cdccf3123144cd946ced41749f1c43-5cee3955877944b4916a292f0063c92f on tablet 5cee3955877944b4916a292f0063c92f with peer 86cdccf3123144cd946ced41749f1c43
W20250624 19:57:59.123071  3788 ts_tablet_manager.cc:726] T 5cee3955877944b4916a292f0063c92f P 86cdccf3123144cd946ced41749f1c43: Tablet Copy: Invalid argument: Leader has replica of tablet 5cee3955877944b4916a292f0063c92f with term 0, which is lower than last-logged term 1 on local replica. Rejecting tablet copy request
I20250624 19:57:59.130204  3788 ts_tablet_manager.cc:927] T 72e6ce12b5de4513b71381caf90211ce P 86cdccf3123144cd946ced41749f1c43: Initiating tablet copy from peer c9ff5371a1a14662a1a2fec8dd43e81c (127.3.15.65:37511)
I20250624 19:57:59.131894  3788 tablet_copy_client.cc:323] T 72e6ce12b5de4513b71381caf90211ce P 86cdccf3123144cd946ced41749f1c43: tablet copy: Beginning tablet copy session from remote peer at address 127.3.15.65:37511
I20250624 19:57:59.133361  3338 tablet_copy_service.cc:140] P c9ff5371a1a14662a1a2fec8dd43e81c: Received BeginTabletCopySession request for tablet 72e6ce12b5de4513b71381caf90211ce from peer 86cdccf3123144cd946ced41749f1c43 ({username='slave'} at 127.3.15.66:40257)
I20250624 19:57:59.133767  3338 tablet_copy_service.cc:161] P c9ff5371a1a14662a1a2fec8dd43e81c: Beginning new tablet copy session on tablet 72e6ce12b5de4513b71381caf90211ce from peer 86cdccf3123144cd946ced41749f1c43 at {username='slave'} at 127.3.15.66:40257: session id = 86cdccf3123144cd946ced41749f1c43-72e6ce12b5de4513b71381caf90211ce
I20250624 19:57:59.138622  3338 tablet_copy_source_session.cc:215] T 72e6ce12b5de4513b71381caf90211ce P c9ff5371a1a14662a1a2fec8dd43e81c: Tablet Copy: opened 0 blocks and 1 log segments
I20250624 19:57:59.141111  3788 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 72e6ce12b5de4513b71381caf90211ce. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 19:57:59.150281  3788 tablet_copy_client.cc:806] T 72e6ce12b5de4513b71381caf90211ce P 86cdccf3123144cd946ced41749f1c43: tablet copy: Starting download of 0 data blocks...
I20250624 19:57:59.150787  3788 tablet_copy_client.cc:670] T 72e6ce12b5de4513b71381caf90211ce P 86cdccf3123144cd946ced41749f1c43: tablet copy: Starting download of 1 WAL segments...
I20250624 19:57:59.163468  3788 tablet_copy_client.cc:538] T 72e6ce12b5de4513b71381caf90211ce P 86cdccf3123144cd946ced41749f1c43: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20250624 19:57:59.202721  3788 tablet_bootstrap.cc:492] T 72e6ce12b5de4513b71381caf90211ce P 86cdccf3123144cd946ced41749f1c43: Bootstrap starting.
I20250624 19:58:00.459039  3788 tablet_bootstrap.cc:492] T 72e6ce12b5de4513b71381caf90211ce P 86cdccf3123144cd946ced41749f1c43: Bootstrap replayed 1/1 log segments. Stats: ops{read=216 overwritten=0 applied=216 ignored=0} inserts{seen=2791 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 19:58:00.459815  3788 tablet_bootstrap.cc:492] T 72e6ce12b5de4513b71381caf90211ce P 86cdccf3123144cd946ced41749f1c43: Bootstrap complete.
I20250624 19:58:00.460439  3788 ts_tablet_manager.cc:1397] T 72e6ce12b5de4513b71381caf90211ce P 86cdccf3123144cd946ced41749f1c43: Time spent bootstrapping tablet: real 1.258s	user 1.211s	sys 0.044s
I20250624 19:58:00.462415  3788 raft_consensus.cc:357] T 72e6ce12b5de4513b71381caf90211ce P 86cdccf3123144cd946ced41749f1c43 [term 1 NON_PARTICIPANT]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 37511 } }
I20250624 19:58:00.462779  3788 raft_consensus.cc:738] T 72e6ce12b5de4513b71381caf90211ce P 86cdccf3123144cd946ced41749f1c43 [term 1 NON_PARTICIPANT]: Becoming Follower/Learner. State: Replica: 86cdccf3123144cd946ced41749f1c43, State: Initialized, Role: NON_PARTICIPANT
I20250624 19:58:00.463224  3788 consensus_queue.cc:260] T 72e6ce12b5de4513b71381caf90211ce P 86cdccf3123144cd946ced41749f1c43 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 216, Last appended: 1.216, Last appended by leader: 216, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 37511 } }
I20250624 19:58:00.466871  3788 ts_tablet_manager.cc:1428] T 72e6ce12b5de4513b71381caf90211ce P 86cdccf3123144cd946ced41749f1c43: Time spent starting tablet: real 0.006s	user 0.001s	sys 0.004s
I20250624 19:58:00.468533  3338 tablet_copy_service.cc:342] P c9ff5371a1a14662a1a2fec8dd43e81c: Request end of tablet copy session 86cdccf3123144cd946ced41749f1c43-72e6ce12b5de4513b71381caf90211ce received from {username='slave'} at 127.3.15.66:40257
I20250624 19:58:00.468971  3338 tablet_copy_service.cc:434] P c9ff5371a1a14662a1a2fec8dd43e81c: ending tablet copy session 86cdccf3123144cd946ced41749f1c43-72e6ce12b5de4513b71381caf90211ce on tablet 72e6ce12b5de4513b71381caf90211ce with peer 86cdccf3123144cd946ced41749f1c43
W20250624 19:58:00.472803  3788 ts_tablet_manager.cc:726] T 1b4497262bd14982b89dc3803b858391 P 86cdccf3123144cd946ced41749f1c43: Tablet Copy: Invalid argument: Leader has replica of tablet 1b4497262bd14982b89dc3803b858391 with term 0, which is lower than last-logged term 1 on local replica. Rejecting tablet copy request
W20250624 19:58:00.479526  3788 ts_tablet_manager.cc:726] T 72e6ce12b5de4513b71381caf90211ce P 86cdccf3123144cd946ced41749f1c43: Tablet Copy: Invalid argument: Leader has replica of tablet 72e6ce12b5de4513b71381caf90211ce with term 0, which is lower than last-logged term 1 on local replica. Rejecting tablet copy request
I20250624 19:58:00.483575  3788 ts_tablet_manager.cc:927] T baea20be0c8b42508f2ad3a5e202d856 P 86cdccf3123144cd946ced41749f1c43: Initiating tablet copy from peer c9ff5371a1a14662a1a2fec8dd43e81c (127.3.15.65:37511)
I20250624 19:58:00.484838  3788 tablet_copy_client.cc:323] T baea20be0c8b42508f2ad3a5e202d856 P 86cdccf3123144cd946ced41749f1c43: tablet copy: Beginning tablet copy session from remote peer at address 127.3.15.65:37511
I20250624 19:58:00.486341  3338 tablet_copy_service.cc:140] P c9ff5371a1a14662a1a2fec8dd43e81c: Received BeginTabletCopySession request for tablet baea20be0c8b42508f2ad3a5e202d856 from peer 86cdccf3123144cd946ced41749f1c43 ({username='slave'} at 127.3.15.66:40257)
I20250624 19:58:00.486778  3338 tablet_copy_service.cc:161] P c9ff5371a1a14662a1a2fec8dd43e81c: Beginning new tablet copy session on tablet baea20be0c8b42508f2ad3a5e202d856 from peer 86cdccf3123144cd946ced41749f1c43 at {username='slave'} at 127.3.15.66:40257: session id = 86cdccf3123144cd946ced41749f1c43-baea20be0c8b42508f2ad3a5e202d856
I20250624 19:58:00.492053  3338 tablet_copy_source_session.cc:215] T baea20be0c8b42508f2ad3a5e202d856 P c9ff5371a1a14662a1a2fec8dd43e81c: Tablet Copy: opened 0 blocks and 1 log segments
I20250624 19:58:00.494565  3788 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet baea20be0c8b42508f2ad3a5e202d856. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 19:58:00.503538  3788 tablet_copy_client.cc:806] T baea20be0c8b42508f2ad3a5e202d856 P 86cdccf3123144cd946ced41749f1c43: tablet copy: Starting download of 0 data blocks...
I20250624 19:58:00.504026  3788 tablet_copy_client.cc:670] T baea20be0c8b42508f2ad3a5e202d856 P 86cdccf3123144cd946ced41749f1c43: tablet copy: Starting download of 1 WAL segments...
I20250624 19:58:00.517648  3788 tablet_copy_client.cc:538] T baea20be0c8b42508f2ad3a5e202d856 P 86cdccf3123144cd946ced41749f1c43: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20250624 19:58:00.619921  3788 tablet_bootstrap.cc:492] T baea20be0c8b42508f2ad3a5e202d856 P 86cdccf3123144cd946ced41749f1c43: Bootstrap starting.
I20250624 19:58:01.878065  3788 tablet_bootstrap.cc:492] T baea20be0c8b42508f2ad3a5e202d856 P 86cdccf3123144cd946ced41749f1c43: Bootstrap replayed 1/1 log segments. Stats: ops{read=216 overwritten=0 applied=216 ignored=0} inserts{seen=2590 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 19:58:01.878801  3788 tablet_bootstrap.cc:492] T baea20be0c8b42508f2ad3a5e202d856 P 86cdccf3123144cd946ced41749f1c43: Bootstrap complete.
I20250624 19:58:01.879302  3788 ts_tablet_manager.cc:1397] T baea20be0c8b42508f2ad3a5e202d856 P 86cdccf3123144cd946ced41749f1c43: Time spent bootstrapping tablet: real 1.260s	user 1.185s	sys 0.072s
I20250624 19:58:01.881605  3788 raft_consensus.cc:357] T baea20be0c8b42508f2ad3a5e202d856 P 86cdccf3123144cd946ced41749f1c43 [term 1 NON_PARTICIPANT]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 37511 } }
I20250624 19:58:01.882145  3788 raft_consensus.cc:738] T baea20be0c8b42508f2ad3a5e202d856 P 86cdccf3123144cd946ced41749f1c43 [term 1 NON_PARTICIPANT]: Becoming Follower/Learner. State: Replica: 86cdccf3123144cd946ced41749f1c43, State: Initialized, Role: NON_PARTICIPANT
I20250624 19:58:01.883517  3788 consensus_queue.cc:260] T baea20be0c8b42508f2ad3a5e202d856 P 86cdccf3123144cd946ced41749f1c43 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 216, Last appended: 1.216, Last appended by leader: 216, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "c9ff5371a1a14662a1a2fec8dd43e81c" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 37511 } }
I20250624 19:58:01.886073  3788 ts_tablet_manager.cc:1428] T baea20be0c8b42508f2ad3a5e202d856 P 86cdccf3123144cd946ced41749f1c43: Time spent starting tablet: real 0.007s	user 0.007s	sys 0.001s
I20250624 19:58:01.888397  3338 tablet_copy_service.cc:342] P c9ff5371a1a14662a1a2fec8dd43e81c: Request end of tablet copy session 86cdccf3123144cd946ced41749f1c43-baea20be0c8b42508f2ad3a5e202d856 received from {username='slave'} at 127.3.15.66:40257
I20250624 19:58:01.889487  3338 tablet_copy_service.cc:434] P c9ff5371a1a14662a1a2fec8dd43e81c: ending tablet copy session 86cdccf3123144cd946ced41749f1c43-baea20be0c8b42508f2ad3a5e202d856 on tablet baea20be0c8b42508f2ad3a5e202d856 with peer 86cdccf3123144cd946ced41749f1c43
W20250624 19:58:01.894555  3788 ts_tablet_manager.cc:726] T baea20be0c8b42508f2ad3a5e202d856 P 86cdccf3123144cd946ced41749f1c43: Tablet Copy: Invalid argument: Leader has replica of tablet baea20be0c8b42508f2ad3a5e202d856 with term 0, which is lower than last-logged term 1 on local replica. Rejecting tablet copy request
I20250624 19:58:01.899914  3133 tablet_copy-itest.cc:1252] Number of Service unavailable responses: 1233
I20250624 19:58:01.900288  3133 tablet_copy-itest.cc:1253] Number of in progress responses: 888
I20250624 19:58:01.903522  3133 external_mini_cluster.cc:1620] Killing /tmp/dist-test-task37W5hK/build/tsan/bin/kudu with pid 3234
I20250624 19:58:01.955505  3133 external_mini_cluster.cc:1620] Killing /tmp/dist-test-task37W5hK/build/tsan/bin/kudu with pid 3652
I20250624 19:58:01.986606  3133 external_mini_cluster.cc:1620] Killing /tmp/dist-test-task37W5hK/build/tsan/bin/kudu with pid 3507
2025-06-24T19:58:02Z chronyd exiting
[       OK ] TabletCopyITest.TestTabletCopyThrottling (23809 ms)
[ RUN      ] TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate
2025-06-24T19:58:02Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2025-06-24T19:58:02Z Disabled control of system clock
I20250624 19:58:02.086625  3133 external_mini_cluster.cc:1351] Running /tmp/dist-test-task37W5hK/build/tsan/bin/kudu
/tmp/dist-test-task37W5hK/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.3.15.126:40161
--webserver_interface=127.3.15.126
--webserver_port=0
--builtin_ntp_servers=127.3.15.84:44549
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.3.15.126:40161 with env {}
W20250624 19:58:02.406839  3807 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 19:58:02.407526  3807 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 19:58:02.408004  3807 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 19:58:02.440261  3807 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20250624 19:58:02.440675  3807 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 19:58:02.440953  3807 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20250624 19:58:02.441207  3807 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20250624 19:58:02.478084  3807 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.3.15.84:44549
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.3.15.126:40161
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.3.15.126:40161
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.3.15.126
--webserver_port=0
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true

Master server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 19:43:22 UTC on 5fd53c4cbb9d
build id 6753
TSAN enabled
I20250624 19:58:02.479967  3807 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 19:58:02.482358  3807 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 19:58:02.499075  3814 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 19:58:02.499325  3816 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 19:58:02.499074  3813 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 19:58:02.500849  3807 server_base.cc:1048] running on GCE node
I20250624 19:58:03.718356  3807 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 19:58:03.721817  3807 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 19:58:03.723336  3807 hybrid_clock.cc:648] HybridClock initialized: now 1750795083723283 us; error 64 us; skew 500 ppm
I20250624 19:58:03.724231  3807 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 19:58:03.731695  3807 webserver.cc:469] Webserver started at http://127.3.15.126:37433/ using document root <none> and password file <none>
I20250624 19:58:03.732967  3807 fs_manager.cc:362] Metadata directory not provided
I20250624 19:58:03.733214  3807 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 19:58:03.733778  3807 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 19:58:03.738708  3807 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/master-0/data/instance:
uuid: "126074fc73d84ddc8590f82344d48a18"
format_stamp: "Formatted at 2025-06-24 19:58:03 on dist-test-slave-0t1p"
I20250624 19:58:03.740161  3807 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/master-0/wal/instance:
uuid: "126074fc73d84ddc8590f82344d48a18"
format_stamp: "Formatted at 2025-06-24 19:58:03 on dist-test-slave-0t1p"
I20250624 19:58:03.748397  3807 fs_manager.cc:696] Time spent creating directory manager: real 0.007s	user 0.003s	sys 0.004s
I20250624 19:58:03.754837  3823 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20250624 19:58:03.756057  3807 fs_manager.cc:730] Time spent opening block manager: real 0.004s	user 0.005s	sys 0.001s
I20250624 19:58:03.756453  3807 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/master-0/data,/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/master-0/wal
uuid: "126074fc73d84ddc8590f82344d48a18"
format_stamp: "Formatted at 2025-06-24 19:58:03 on dist-test-slave-0t1p"
I20250624 19:58:03.756886  3807 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 19:58:03.807998  3807 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20250624 19:58:03.809780  3807 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 19:58:03.810284  3807 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 19:58:03.888345  3807 rpc_server.cc:307] RPC server started. Bound to: 127.3.15.126:40161
I20250624 19:58:03.888443  3874 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.3.15.126:40161 every 8 connection(s)
I20250624 19:58:03.891247  3807 server_base.cc:1180] Dumped server information to /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/master-0/data/info.pb
I20250624 19:58:03.894200  3133 external_mini_cluster.cc:1413] Started /tmp/dist-test-task37W5hK/build/tsan/bin/kudu as pid 3807
I20250624 19:58:03.894847  3133 external_mini_cluster.cc:1427] Reading /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/master-0/wal/instance
I20250624 19:58:03.897953  3875 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 19:58:03.924264  3875 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 126074fc73d84ddc8590f82344d48a18: Bootstrap starting.
I20250624 19:58:03.930935  3875 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 126074fc73d84ddc8590f82344d48a18: Neither blocks nor log segments found. Creating new log.
I20250624 19:58:03.933207  3875 log.cc:826] T 00000000000000000000000000000000 P 126074fc73d84ddc8590f82344d48a18: Log is configured to *not* fsync() on all Append() calls
I20250624 19:58:03.938263  3875 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 126074fc73d84ddc8590f82344d48a18: No bootstrap required, opened a new log
I20250624 19:58:03.957916  3875 raft_consensus.cc:357] T 00000000000000000000000000000000 P 126074fc73d84ddc8590f82344d48a18 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "126074fc73d84ddc8590f82344d48a18" member_type: VOTER last_known_addr { host: "127.3.15.126" port: 40161 } }
I20250624 19:58:03.958890  3875 raft_consensus.cc:383] T 00000000000000000000000000000000 P 126074fc73d84ddc8590f82344d48a18 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 19:58:03.959219  3875 raft_consensus.cc:738] T 00000000000000000000000000000000 P 126074fc73d84ddc8590f82344d48a18 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 126074fc73d84ddc8590f82344d48a18, State: Initialized, Role: FOLLOWER
I20250624 19:58:03.960091  3875 consensus_queue.cc:260] T 00000000000000000000000000000000 P 126074fc73d84ddc8590f82344d48a18 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "126074fc73d84ddc8590f82344d48a18" member_type: VOTER last_known_addr { host: "127.3.15.126" port: 40161 } }
I20250624 19:58:03.960829  3875 raft_consensus.cc:397] T 00000000000000000000000000000000 P 126074fc73d84ddc8590f82344d48a18 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 19:58:03.961128  3875 raft_consensus.cc:491] T 00000000000000000000000000000000 P 126074fc73d84ddc8590f82344d48a18 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 19:58:03.961417  3875 raft_consensus.cc:3058] T 00000000000000000000000000000000 P 126074fc73d84ddc8590f82344d48a18 [term 0 FOLLOWER]: Advancing to term 1
I20250624 19:58:03.965698  3875 raft_consensus.cc:513] T 00000000000000000000000000000000 P 126074fc73d84ddc8590f82344d48a18 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "126074fc73d84ddc8590f82344d48a18" member_type: VOTER last_known_addr { host: "127.3.15.126" port: 40161 } }
I20250624 19:58:03.966468  3875 leader_election.cc:304] T 00000000000000000000000000000000 P 126074fc73d84ddc8590f82344d48a18 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 126074fc73d84ddc8590f82344d48a18; no voters: 
I20250624 19:58:03.968134  3875 leader_election.cc:290] T 00000000000000000000000000000000 P 126074fc73d84ddc8590f82344d48a18 [CANDIDATE]: Term 1 election: Requested vote from peers 
I20250624 19:58:03.968907  3880 raft_consensus.cc:2802] T 00000000000000000000000000000000 P 126074fc73d84ddc8590f82344d48a18 [term 1 FOLLOWER]: Leader election won for term 1
I20250624 19:58:03.972172  3880 raft_consensus.cc:695] T 00000000000000000000000000000000 P 126074fc73d84ddc8590f82344d48a18 [term 1 LEADER]: Becoming Leader. State: Replica: 126074fc73d84ddc8590f82344d48a18, State: Running, Role: LEADER
I20250624 19:58:03.972940  3875 sys_catalog.cc:564] T 00000000000000000000000000000000 P 126074fc73d84ddc8590f82344d48a18 [sys.catalog]: configured and running, proceeding with master startup.
I20250624 19:58:03.973086  3880 consensus_queue.cc:237] T 00000000000000000000000000000000 P 126074fc73d84ddc8590f82344d48a18 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "126074fc73d84ddc8590f82344d48a18" member_type: VOTER last_known_addr { host: "127.3.15.126" port: 40161 } }
I20250624 19:58:03.985400  3881 sys_catalog.cc:455] T 00000000000000000000000000000000 P 126074fc73d84ddc8590f82344d48a18 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "126074fc73d84ddc8590f82344d48a18" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "126074fc73d84ddc8590f82344d48a18" member_type: VOTER last_known_addr { host: "127.3.15.126" port: 40161 } } }
I20250624 19:58:03.985425  3882 sys_catalog.cc:455] T 00000000000000000000000000000000 P 126074fc73d84ddc8590f82344d48a18 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 126074fc73d84ddc8590f82344d48a18. Latest consensus state: current_term: 1 leader_uuid: "126074fc73d84ddc8590f82344d48a18" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "126074fc73d84ddc8590f82344d48a18" member_type: VOTER last_known_addr { host: "127.3.15.126" port: 40161 } } }
I20250624 19:58:03.986464  3881 sys_catalog.cc:458] T 00000000000000000000000000000000 P 126074fc73d84ddc8590f82344d48a18 [sys.catalog]: This master's current role is: LEADER
I20250624 19:58:03.986557  3882 sys_catalog.cc:458] T 00000000000000000000000000000000 P 126074fc73d84ddc8590f82344d48a18 [sys.catalog]: This master's current role is: LEADER
I20250624 19:58:03.991113  3889 catalog_manager.cc:1477] Loading table and tablet metadata into memory...
I20250624 19:58:04.006556  3889 catalog_manager.cc:1486] Initializing Kudu cluster ID...
I20250624 19:58:04.027321  3889 catalog_manager.cc:1349] Generated new cluster ID: 50a087652f9d40f68fbc9e1ecdfb11f2
I20250624 19:58:04.027726  3889 catalog_manager.cc:1497] Initializing Kudu internal certificate authority...
I20250624 19:58:04.049518  3889 catalog_manager.cc:1372] Generated new certificate authority record
I20250624 19:58:04.051729  3889 catalog_manager.cc:1506] Loading token signing keys...
I20250624 19:58:04.072034  3889 catalog_manager.cc:5955] T 00000000000000000000000000000000 P 126074fc73d84ddc8590f82344d48a18: Generated new TSK 0
I20250624 19:58:04.073280  3889 catalog_manager.cc:1516] Initializing in-progress tserver states...
I20250624 19:58:04.094774  3133 external_mini_cluster.cc:1351] Running /tmp/dist-test-task37W5hK/build/tsan/bin/kudu
/tmp/dist-test-task37W5hK/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.3.15.65:0
--local_ip_for_outbound_sockets=127.3.15.65
--webserver_interface=127.3.15.65
--webserver_port=0
--tserver_master_addrs=127.3.15.126:40161
--builtin_ntp_servers=127.3.15.84:44549
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--enable_flush_memrowset=false
--enable_flush_deltamemstores=false
--tablet_copy_download_threads_nums_per_session=4
--log_segment_size_mb=1 with env {}
W20250624 19:58:04.452688  3899 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 19:58:04.453258  3899 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 19:58:04.453734  3899 flags.cc:425] Enabled unsafe flag: --enable_flush_deltamemstores=false
W20250624 19:58:04.453986  3899 flags.cc:425] Enabled unsafe flag: --enable_flush_memrowset=false
W20250624 19:58:04.454334  3899 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 19:58:04.489583  3899 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 19:58:04.490554  3899 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.3.15.65
I20250624 19:58:04.530129  3899 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.3.15.84:44549
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_segment_size_mb=1
--fs_data_dirs=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.3.15.65:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.3.15.65
--webserver_port=0
--enable_flush_deltamemstores=false
--enable_flush_memrowset=false
--tserver_master_addrs=127.3.15.126:40161
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.3.15.65
--log_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 19:43:22 UTC on 5fd53c4cbb9d
build id 6753
TSAN enabled
I20250624 19:58:04.531610  3899 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 19:58:04.533641  3899 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 19:58:04.553349  3906 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 19:58:04.554677  3905 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 19:58:04.558394  3908 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 19:58:04.557596  3899 server_base.cc:1048] running on GCE node
I20250624 19:58:05.779417  3899 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 19:58:05.782517  3899 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 19:58:05.784132  3899 hybrid_clock.cc:648] HybridClock initialized: now 1750795085784086 us; error 69 us; skew 500 ppm
I20250624 19:58:05.785072  3899 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 19:58:05.792979  3899 webserver.cc:469] Webserver started at http://127.3.15.65:46663/ using document root <none> and password file <none>
I20250624 19:58:05.794011  3899 fs_manager.cc:362] Metadata directory not provided
I20250624 19:58:05.794257  3899 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 19:58:05.794746  3899 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 19:58:05.799667  3899 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-0/data/instance:
uuid: "9bd87754848d4ac7a22df1de055c1cef"
format_stamp: "Formatted at 2025-06-24 19:58:05 on dist-test-slave-0t1p"
I20250624 19:58:05.801046  3899 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-0/wal/instance:
uuid: "9bd87754848d4ac7a22df1de055c1cef"
format_stamp: "Formatted at 2025-06-24 19:58:05 on dist-test-slave-0t1p"
I20250624 19:58:05.809607  3899 fs_manager.cc:696] Time spent creating directory manager: real 0.008s	user 0.007s	sys 0.001s
I20250624 19:58:05.816382  3915 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20250624 19:58:05.817708  3899 fs_manager.cc:730] Time spent opening block manager: real 0.005s	user 0.003s	sys 0.001s
I20250624 19:58:05.818077  3899 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-0/data,/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-0/wal
uuid: "9bd87754848d4ac7a22df1de055c1cef"
format_stamp: "Formatted at 2025-06-24 19:58:05 on dist-test-slave-0t1p"
I20250624 19:58:05.818436  3899 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 19:58:05.889976  3899 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20250624 19:58:05.891534  3899 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 19:58:05.892113  3899 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 19:58:05.895023  3899 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 19:58:05.899976  3899 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 19:58:05.900228  3899 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20250624 19:58:05.900488  3899 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 19:58:05.900688  3899 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s	user 0.000s	sys 0.001s
I20250624 19:58:06.042057  3899 rpc_server.cc:307] RPC server started. Bound to: 127.3.15.65:41589
I20250624 19:58:06.042178  4027 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.3.15.65:41589 every 8 connection(s)
I20250624 19:58:06.044893  3899 server_base.cc:1180] Dumped server information to /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-0/data/info.pb
I20250624 19:58:06.052255  3133 external_mini_cluster.cc:1413] Started /tmp/dist-test-task37W5hK/build/tsan/bin/kudu as pid 3899
I20250624 19:58:06.052706  3133 external_mini_cluster.cc:1427] Reading /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-0/wal/instance
I20250624 19:58:06.059574  3133 external_mini_cluster.cc:1351] Running /tmp/dist-test-task37W5hK/build/tsan/bin/kudu
/tmp/dist-test-task37W5hK/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.3.15.66:0
--local_ip_for_outbound_sockets=127.3.15.66
--webserver_interface=127.3.15.66
--webserver_port=0
--tserver_master_addrs=127.3.15.126:40161
--builtin_ntp_servers=127.3.15.84:44549
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--enable_flush_memrowset=false
--enable_flush_deltamemstores=false
--tablet_copy_download_threads_nums_per_session=4
--log_segment_size_mb=1 with env {}
I20250624 19:58:06.069640  4028 heartbeater.cc:344] Connected to a master server at 127.3.15.126:40161
I20250624 19:58:06.070230  4028 heartbeater.cc:461] Registering TS with master...
I20250624 19:58:06.071713  4028 heartbeater.cc:507] Master 127.3.15.126:40161 requested a full tablet report, sending...
I20250624 19:58:06.075125  3840 ts_manager.cc:194] Registered new tserver with Master: 9bd87754848d4ac7a22df1de055c1cef (127.3.15.65:41589)
I20250624 19:58:06.077337  3840 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.3.15.65:55147
W20250624 19:58:06.380889  4032 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 19:58:06.381577  4032 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 19:58:06.381999  4032 flags.cc:425] Enabled unsafe flag: --enable_flush_deltamemstores=false
W20250624 19:58:06.382187  4032 flags.cc:425] Enabled unsafe flag: --enable_flush_memrowset=false
W20250624 19:58:06.382484  4032 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 19:58:06.415513  4032 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 19:58:06.416486  4032 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.3.15.66
I20250624 19:58:06.452637  4032 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.3.15.84:44549
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_segment_size_mb=1
--fs_data_dirs=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.3.15.66:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.3.15.66
--webserver_port=0
--enable_flush_deltamemstores=false
--enable_flush_memrowset=false
--tserver_master_addrs=127.3.15.126:40161
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.3.15.66
--log_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 19:43:22 UTC on 5fd53c4cbb9d
build id 6753
TSAN enabled
I20250624 19:58:06.454070  4032 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 19:58:06.455866  4032 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 19:58:06.474836  4038 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 19:58:07.082271  4028 heartbeater.cc:499] Master 127.3.15.126:40161 was elected leader, sending a full tablet report...
W20250624 19:58:06.474946  4039 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 19:58:06.477880  4032 server_base.cc:1048] running on GCE node
W20250624 19:58:06.475009  4041 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 19:58:07.664496  4032 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 19:58:07.667508  4032 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 19:58:07.668993  4032 hybrid_clock.cc:648] HybridClock initialized: now 1750795087668944 us; error 63 us; skew 500 ppm
I20250624 19:58:07.669981  4032 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 19:58:07.677759  4032 webserver.cc:469] Webserver started at http://127.3.15.66:43259/ using document root <none> and password file <none>
I20250624 19:58:07.678802  4032 fs_manager.cc:362] Metadata directory not provided
I20250624 19:58:07.679030  4032 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 19:58:07.679464  4032 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 19:58:07.684166  4032 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-1/data/instance:
uuid: "68b8a31f84ff465b97d2cb3c2fa3a021"
format_stamp: "Formatted at 2025-06-24 19:58:07 on dist-test-slave-0t1p"
I20250624 19:58:07.685437  4032 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-1/wal/instance:
uuid: "68b8a31f84ff465b97d2cb3c2fa3a021"
format_stamp: "Formatted at 2025-06-24 19:58:07 on dist-test-slave-0t1p"
I20250624 19:58:07.693554  4032 fs_manager.cc:696] Time spent creating directory manager: real 0.007s	user 0.006s	sys 0.001s
I20250624 19:58:07.699887  4048 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20250624 19:58:07.701196  4032 fs_manager.cc:730] Time spent opening block manager: real 0.004s	user 0.002s	sys 0.001s
I20250624 19:58:07.701587  4032 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-1/data,/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-1/wal
uuid: "68b8a31f84ff465b97d2cb3c2fa3a021"
format_stamp: "Formatted at 2025-06-24 19:58:07 on dist-test-slave-0t1p"
I20250624 19:58:07.701983  4032 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 19:58:07.755563  4032 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20250624 19:58:07.757234  4032 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 19:58:07.757746  4032 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 19:58:07.760442  4032 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 19:58:07.764959  4032 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 19:58:07.765194  4032 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20250624 19:58:07.765411  4032 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 19:58:07.765560  4032 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20250624 19:58:07.919505  4032 rpc_server.cc:307] RPC server started. Bound to: 127.3.15.66:44671
I20250624 19:58:07.919651  4160 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.3.15.66:44671 every 8 connection(s)
I20250624 19:58:07.922400  4032 server_base.cc:1180] Dumped server information to /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-1/data/info.pb
I20250624 19:58:07.933022  3133 external_mini_cluster.cc:1413] Started /tmp/dist-test-task37W5hK/build/tsan/bin/kudu as pid 4032
I20250624 19:58:07.933564  3133 external_mini_cluster.cc:1427] Reading /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-1/wal/instance
I20250624 19:58:07.941352  3133 external_mini_cluster.cc:1351] Running /tmp/dist-test-task37W5hK/build/tsan/bin/kudu
/tmp/dist-test-task37W5hK/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.3.15.67:0
--local_ip_for_outbound_sockets=127.3.15.67
--webserver_interface=127.3.15.67
--webserver_port=0
--tserver_master_addrs=127.3.15.126:40161
--builtin_ntp_servers=127.3.15.84:44549
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--enable_flush_memrowset=false
--enable_flush_deltamemstores=false
--tablet_copy_download_threads_nums_per_session=4
--log_segment_size_mb=1 with env {}
I20250624 19:58:07.948581  4161 heartbeater.cc:344] Connected to a master server at 127.3.15.126:40161
I20250624 19:58:07.949224  4161 heartbeater.cc:461] Registering TS with master...
I20250624 19:58:07.950507  4161 heartbeater.cc:507] Master 127.3.15.126:40161 requested a full tablet report, sending...
I20250624 19:58:07.953192  3840 ts_manager.cc:194] Registered new tserver with Master: 68b8a31f84ff465b97d2cb3c2fa3a021 (127.3.15.66:44671)
I20250624 19:58:07.954757  3840 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.3.15.66:56339
W20250624 19:58:08.263758  4165 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 19:58:08.264315  4165 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 19:58:08.264804  4165 flags.cc:425] Enabled unsafe flag: --enable_flush_deltamemstores=false
W20250624 19:58:08.265031  4165 flags.cc:425] Enabled unsafe flag: --enable_flush_memrowset=false
W20250624 19:58:08.265357  4165 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 19:58:08.299419  4165 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 19:58:08.300458  4165 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.3.15.67
I20250624 19:58:08.337857  4165 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.3.15.84:44549
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_segment_size_mb=1
--fs_data_dirs=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.3.15.67:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.3.15.67
--webserver_port=0
--enable_flush_deltamemstores=false
--enable_flush_memrowset=false
--tserver_master_addrs=127.3.15.126:40161
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.3.15.67
--log_dir=/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true

Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 19:43:22 UTC on 5fd53c4cbb9d
build id 6753
TSAN enabled
I20250624 19:58:08.339367  4165 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 19:58:08.341267  4165 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 19:58:08.359910  4172 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 19:58:08.958595  4161 heartbeater.cc:499] Master 127.3.15.126:40161 was elected leader, sending a full tablet report...
W20250624 19:58:08.361187  4173 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 19:58:08.362738  4165 server_base.cc:1048] running on GCE node
W20250624 19:58:08.363078  4175 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 19:58:09.541958  4165 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 19:58:09.544348  4165 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 19:58:09.545897  4165 hybrid_clock.cc:648] HybridClock initialized: now 1750795089545818 us; error 87 us; skew 500 ppm
I20250624 19:58:09.546845  4165 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 19:58:09.558730  4165 webserver.cc:469] Webserver started at http://127.3.15.67:43315/ using document root <none> and password file <none>
I20250624 19:58:09.559738  4165 fs_manager.cc:362] Metadata directory not provided
I20250624 19:58:09.559976  4165 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 19:58:09.560415  4165 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 19:58:09.565238  4165 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-2/data/instance:
uuid: "9303e28334fc44629868721544e0dc96"
format_stamp: "Formatted at 2025-06-24 19:58:09 on dist-test-slave-0t1p"
I20250624 19:58:09.566565  4165 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-2/wal/instance:
uuid: "9303e28334fc44629868721544e0dc96"
format_stamp: "Formatted at 2025-06-24 19:58:09 on dist-test-slave-0t1p"
I20250624 19:58:09.574997  4165 fs_manager.cc:696] Time spent creating directory manager: real 0.008s	user 0.005s	sys 0.004s
I20250624 19:58:09.581344  4182 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s	user 0.000s	sys 0.000s
I20250624 19:58:09.582659  4165 fs_manager.cc:730] Time spent opening block manager: real 0.004s	user 0.004s	sys 0.000s
I20250624 19:58:09.583060  4165 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-2/data,/tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-2/wal
uuid: "9303e28334fc44629868721544e0dc96"
format_stamp: "Formatted at 2025-06-24 19:58:09 on dist-test-slave-0t1p"
I20250624 19:58:09.583452  4165 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 19:58:09.644809  4165 rpc_server.cc:225] running with OpenSSL 1.1.1  11 Sep 2018
I20250624 19:58:09.646411  4165 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 19:58:09.646924  4165 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 19:58:09.649682  4165 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 19:58:09.654152  4165 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 19:58:09.654438  4165 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s	user 0.000s	sys 0.000s
I20250624 19:58:09.654713  4165 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 19:58:09.654881  4165 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s	user 0.000s	sys 0.000s
I20250624 19:58:09.799176  4165 rpc_server.cc:307] RPC server started. Bound to: 127.3.15.67:35785
I20250624 19:58:09.799293  4294 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.3.15.67:35785 every 8 connection(s)
I20250624 19:58:09.801967  4165 server_base.cc:1180] Dumped server information to /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-2/data/info.pb
I20250624 19:58:09.812577  3133 external_mini_cluster.cc:1413] Started /tmp/dist-test-task37W5hK/build/tsan/bin/kudu as pid 4165
I20250624 19:58:09.813238  3133 external_mini_cluster.cc:1427] Reading /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0/minicluster-data/ts-2/wal/instance
I20250624 19:58:09.824586  4295 heartbeater.cc:344] Connected to a master server at 127.3.15.126:40161
I20250624 19:58:09.825062  4295 heartbeater.cc:461] Registering TS with master...
I20250624 19:58:09.826144  4295 heartbeater.cc:507] Master 127.3.15.126:40161 requested a full tablet report, sending...
I20250624 19:58:09.828629  3840 ts_manager.cc:194] Registered new tserver with Master: 9303e28334fc44629868721544e0dc96 (127.3.15.67:35785)
I20250624 19:58:09.829996  3840 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.3.15.67:39755
I20250624 19:58:09.835115  3133 external_mini_cluster.cc:934] 3 TS(s) registered with all masters
I20250624 19:58:09.871579  3133 test_util.cc:276] Using random seed: -1083702587
I20250624 19:58:09.917451  3840 catalog_manager.cc:2232] Servicing CreateTable request from {username='slave'} at 127.0.0.1:45178:
name: "test-workload"
schema {
  columns {
    name: "key"
    type: INT32
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "int_val"
    type: INT32
    is_key: false
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "string_val"
    type: STRING
    is_key: false
    is_nullable: true
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
  range_schema {
    columns {
      name: "key"
    }
  }
}
W20250624 19:58:09.920395  3840 catalog_manager.cc:6944] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table test-workload in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20250624 19:58:09.987090  4230 tablet_service.cc:1468] Processing CreateTablet for tablet 4c32bc3832404ac68aa4aac69ef1184d (DEFAULT_TABLE table=test-workload [id=bae97069a962459c95ffe390ca1e4d90]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 19:58:09.989262  4230 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 4c32bc3832404ac68aa4aac69ef1184d. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 19:58:09.992204  3963 tablet_service.cc:1468] Processing CreateTablet for tablet 4c32bc3832404ac68aa4aac69ef1184d (DEFAULT_TABLE table=test-workload [id=bae97069a962459c95ffe390ca1e4d90]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 19:58:09.992698  4096 tablet_service.cc:1468] Processing CreateTablet for tablet 4c32bc3832404ac68aa4aac69ef1184d (DEFAULT_TABLE table=test-workload [id=bae97069a962459c95ffe390ca1e4d90]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 19:58:09.994035  3963 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 4c32bc3832404ac68aa4aac69ef1184d. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 19:58:09.994683  4096 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 4c32bc3832404ac68aa4aac69ef1184d. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 19:58:10.019258  4319 tablet_bootstrap.cc:492] T 4c32bc3832404ac68aa4aac69ef1184d P 9bd87754848d4ac7a22df1de055c1cef: Bootstrap starting.
I20250624 19:58:10.021917  4320 tablet_bootstrap.cc:492] T 4c32bc3832404ac68aa4aac69ef1184d P 9303e28334fc44629868721544e0dc96: Bootstrap starting.
I20250624 19:58:10.025782  4321 tablet_bootstrap.cc:492] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021: Bootstrap starting.
I20250624 19:58:10.028724  4319 tablet_bootstrap.cc:654] T 4c32bc3832404ac68aa4aac69ef1184d P 9bd87754848d4ac7a22df1de055c1cef: Neither blocks nor log segments found. Creating new log.
I20250624 19:58:10.030395  4320 tablet_bootstrap.cc:654] T 4c32bc3832404ac68aa4aac69ef1184d P 9303e28334fc44629868721544e0dc96: Neither blocks nor log segments found. Creating new log.
I20250624 19:58:10.031383  4319 log.cc:826] T 4c32bc3832404ac68aa4aac69ef1184d P 9bd87754848d4ac7a22df1de055c1cef: Log is configured to *not* fsync() on all Append() calls
I20250624 19:58:10.031881  4321 tablet_bootstrap.cc:654] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021: Neither blocks nor log segments found. Creating new log.
I20250624 19:58:10.033008  4320 log.cc:826] T 4c32bc3832404ac68aa4aac69ef1184d P 9303e28334fc44629868721544e0dc96: Log is configured to *not* fsync() on all Append() calls
I20250624 19:58:10.034266  4321 log.cc:826] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021: Log is configured to *not* fsync() on all Append() calls
I20250624 19:58:10.040230  4319 tablet_bootstrap.cc:492] T 4c32bc3832404ac68aa4aac69ef1184d P 9bd87754848d4ac7a22df1de055c1cef: No bootstrap required, opened a new log
I20250624 19:58:10.040371  4321 tablet_bootstrap.cc:492] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021: No bootstrap required, opened a new log
I20250624 19:58:10.040853  4320 tablet_bootstrap.cc:492] T 4c32bc3832404ac68aa4aac69ef1184d P 9303e28334fc44629868721544e0dc96: No bootstrap required, opened a new log
I20250624 19:58:10.040998  4321 ts_tablet_manager.cc:1397] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021: Time spent bootstrapping tablet: real 0.016s	user 0.011s	sys 0.003s
I20250624 19:58:10.041038  4319 ts_tablet_manager.cc:1397] T 4c32bc3832404ac68aa4aac69ef1184d P 9bd87754848d4ac7a22df1de055c1cef: Time spent bootstrapping tablet: real 0.022s	user 0.009s	sys 0.011s
I20250624 19:58:10.041365  4320 ts_tablet_manager.cc:1397] T 4c32bc3832404ac68aa4aac69ef1184d P 9303e28334fc44629868721544e0dc96: Time spent bootstrapping tablet: real 0.020s	user 0.009s	sys 0.009s
I20250624 19:58:10.069446  4321 raft_consensus.cc:357] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "9303e28334fc44629868721544e0dc96" member_type: VOTER last_known_addr { host: "127.3.15.67" port: 35785 } } peers { permanent_uuid: "68b8a31f84ff465b97d2cb3c2fa3a021" member_type: VOTER last_known_addr { host: "127.3.15.66" port: 44671 } } peers { permanent_uuid: "9bd87754848d4ac7a22df1de055c1cef" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 41589 } }
I20250624 19:58:10.070688  4321 raft_consensus.cc:383] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 19:58:10.071074  4321 raft_consensus.cc:738] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 68b8a31f84ff465b97d2cb3c2fa3a021, State: Initialized, Role: FOLLOWER
I20250624 19:58:10.070499  4320 raft_consensus.cc:357] T 4c32bc3832404ac68aa4aac69ef1184d P 9303e28334fc44629868721544e0dc96 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "9303e28334fc44629868721544e0dc96" member_type: VOTER last_known_addr { host: "127.3.15.67" port: 35785 } } peers { permanent_uuid: "68b8a31f84ff465b97d2cb3c2fa3a021" member_type: VOTER last_known_addr { host: "127.3.15.66" port: 44671 } } peers { permanent_uuid: "9bd87754848d4ac7a22df1de055c1cef" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 41589 } }
I20250624 19:58:10.071009  4319 raft_consensus.cc:357] T 4c32bc3832404ac68aa4aac69ef1184d P 9bd87754848d4ac7a22df1de055c1cef [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "9303e28334fc44629868721544e0dc96" member_type: VOTER last_known_addr { host: "127.3.15.67" port: 35785 } } peers { permanent_uuid: "68b8a31f84ff465b97d2cb3c2fa3a021" member_type: VOTER last_known_addr { host: "127.3.15.66" port: 44671 } } peers { permanent_uuid: "9bd87754848d4ac7a22df1de055c1cef" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 41589 } }
I20250624 19:58:10.071717  4320 raft_consensus.cc:383] T 4c32bc3832404ac68aa4aac69ef1184d P 9303e28334fc44629868721544e0dc96 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 19:58:10.072058  4319 raft_consensus.cc:383] T 4c32bc3832404ac68aa4aac69ef1184d P 9bd87754848d4ac7a22df1de055c1cef [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 19:58:10.072140  4320 raft_consensus.cc:738] T 4c32bc3832404ac68aa4aac69ef1184d P 9303e28334fc44629868721544e0dc96 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 9303e28334fc44629868721544e0dc96, State: Initialized, Role: FOLLOWER
I20250624 19:58:10.072434  4319 raft_consensus.cc:738] T 4c32bc3832404ac68aa4aac69ef1184d P 9bd87754848d4ac7a22df1de055c1cef [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 9bd87754848d4ac7a22df1de055c1cef, State: Initialized, Role: FOLLOWER
I20250624 19:58:10.072309  4321 consensus_queue.cc:260] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "9303e28334fc44629868721544e0dc96" member_type: VOTER last_known_addr { host: "127.3.15.67" port: 35785 } } peers { permanent_uuid: "68b8a31f84ff465b97d2cb3c2fa3a021" member_type: VOTER last_known_addr { host: "127.3.15.66" port: 44671 } } peers { permanent_uuid: "9bd87754848d4ac7a22df1de055c1cef" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 41589 } }
I20250624 19:58:10.073186  4320 consensus_queue.cc:260] T 4c32bc3832404ac68aa4aac69ef1184d P 9303e28334fc44629868721544e0dc96 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "9303e28334fc44629868721544e0dc96" member_type: VOTER last_known_addr { host: "127.3.15.67" port: 35785 } } peers { permanent_uuid: "68b8a31f84ff465b97d2cb3c2fa3a021" member_type: VOTER last_known_addr { host: "127.3.15.66" port: 44671 } } peers { permanent_uuid: "9bd87754848d4ac7a22df1de055c1cef" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 41589 } }
I20250624 19:58:10.073743  4319 consensus_queue.cc:260] T 4c32bc3832404ac68aa4aac69ef1184d P 9bd87754848d4ac7a22df1de055c1cef [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "9303e28334fc44629868721544e0dc96" member_type: VOTER last_known_addr { host: "127.3.15.67" port: 35785 } } peers { permanent_uuid: "68b8a31f84ff465b97d2cb3c2fa3a021" member_type: VOTER last_known_addr { host: "127.3.15.66" port: 44671 } } peers { permanent_uuid: "9bd87754848d4ac7a22df1de055c1cef" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 41589 } }
I20250624 19:58:10.081816  4319 ts_tablet_manager.cc:1428] T 4c32bc3832404ac68aa4aac69ef1184d P 9bd87754848d4ac7a22df1de055c1cef: Time spent starting tablet: real 0.040s	user 0.032s	sys 0.005s
I20250624 19:58:10.082235  4295 heartbeater.cc:499] Master 127.3.15.126:40161 was elected leader, sending a full tablet report...
I20250624 19:58:10.086921  4320 ts_tablet_manager.cc:1428] T 4c32bc3832404ac68aa4aac69ef1184d P 9303e28334fc44629868721544e0dc96: Time spent starting tablet: real 0.045s	user 0.037s	sys 0.002s
I20250624 19:58:10.089296  4321 ts_tablet_manager.cc:1428] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021: Time spent starting tablet: real 0.048s	user 0.037s	sys 0.004s
W20250624 19:58:10.180018  4162 tablet.cc:2378] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20250624 19:58:10.220768  4325 raft_consensus.cc:491] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250624 19:58:10.221325  4325 raft_consensus.cc:513] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "9303e28334fc44629868721544e0dc96" member_type: VOTER last_known_addr { host: "127.3.15.67" port: 35785 } } peers { permanent_uuid: "68b8a31f84ff465b97d2cb3c2fa3a021" member_type: VOTER last_known_addr { host: "127.3.15.66" port: 44671 } } peers { permanent_uuid: "9bd87754848d4ac7a22df1de055c1cef" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 41589 } }
I20250624 19:58:10.223907  4325 leader_election.cc:290] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 9303e28334fc44629868721544e0dc96 (127.3.15.67:35785), 9bd87754848d4ac7a22df1de055c1cef (127.3.15.65:41589)
I20250624 19:58:10.236351  4250 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "4c32bc3832404ac68aa4aac69ef1184d" candidate_uuid: "68b8a31f84ff465b97d2cb3c2fa3a021" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "9303e28334fc44629868721544e0dc96" is_pre_election: true
I20250624 19:58:10.236371  3983 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "4c32bc3832404ac68aa4aac69ef1184d" candidate_uuid: "68b8a31f84ff465b97d2cb3c2fa3a021" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "9bd87754848d4ac7a22df1de055c1cef" is_pre_election: true
I20250624 19:58:10.237198  4250 raft_consensus.cc:2466] T 4c32bc3832404ac68aa4aac69ef1184d P 9303e28334fc44629868721544e0dc96 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 68b8a31f84ff465b97d2cb3c2fa3a021 in term 0.
I20250624 19:58:10.237198  3983 raft_consensus.cc:2466] T 4c32bc3832404ac68aa4aac69ef1184d P 9bd87754848d4ac7a22df1de055c1cef [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 68b8a31f84ff465b97d2cb3c2fa3a021 in term 0.
I20250624 19:58:10.238488  4051 leader_election.cc:304] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 68b8a31f84ff465b97d2cb3c2fa3a021, 9303e28334fc44629868721544e0dc96; no voters: 
I20250624 19:58:10.239279  4325 raft_consensus.cc:2802] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20250624 19:58:10.239646  4325 raft_consensus.cc:491] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20250624 19:58:10.239918  4325 raft_consensus.cc:3058] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021 [term 0 FOLLOWER]: Advancing to term 1
I20250624 19:58:10.244455  4325 raft_consensus.cc:513] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "9303e28334fc44629868721544e0dc96" member_type: VOTER last_known_addr { host: "127.3.15.67" port: 35785 } } peers { permanent_uuid: "68b8a31f84ff465b97d2cb3c2fa3a021" member_type: VOTER last_known_addr { host: "127.3.15.66" port: 44671 } } peers { permanent_uuid: "9bd87754848d4ac7a22df1de055c1cef" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 41589 } }
I20250624 19:58:10.246093  4325 leader_election.cc:290] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021 [CANDIDATE]: Term 1 election: Requested vote from peers 9303e28334fc44629868721544e0dc96 (127.3.15.67:35785), 9bd87754848d4ac7a22df1de055c1cef (127.3.15.65:41589)
I20250624 19:58:10.247079  4250 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "4c32bc3832404ac68aa4aac69ef1184d" candidate_uuid: "68b8a31f84ff465b97d2cb3c2fa3a021" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "9303e28334fc44629868721544e0dc96"
I20250624 19:58:10.247187  3983 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "4c32bc3832404ac68aa4aac69ef1184d" candidate_uuid: "68b8a31f84ff465b97d2cb3c2fa3a021" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "9bd87754848d4ac7a22df1de055c1cef"
I20250624 19:58:10.247572  4250 raft_consensus.cc:3058] T 4c32bc3832404ac68aa4aac69ef1184d P 9303e28334fc44629868721544e0dc96 [term 0 FOLLOWER]: Advancing to term 1
I20250624 19:58:10.247680  3983 raft_consensus.cc:3058] T 4c32bc3832404ac68aa4aac69ef1184d P 9bd87754848d4ac7a22df1de055c1cef [term 0 FOLLOWER]: Advancing to term 1
I20250624 19:58:10.252709  4250 raft_consensus.cc:2466] T 4c32bc3832404ac68aa4aac69ef1184d P 9303e28334fc44629868721544e0dc96 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 68b8a31f84ff465b97d2cb3c2fa3a021 in term 1.
I20250624 19:58:10.252702  3983 raft_consensus.cc:2466] T 4c32bc3832404ac68aa4aac69ef1184d P 9bd87754848d4ac7a22df1de055c1cef [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 68b8a31f84ff465b97d2cb3c2fa3a021 in term 1.
I20250624 19:58:10.253885  4052 leader_election.cc:304] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 68b8a31f84ff465b97d2cb3c2fa3a021, 9bd87754848d4ac7a22df1de055c1cef; no voters: 
I20250624 19:58:10.254650  4325 raft_consensus.cc:2802] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021 [term 1 FOLLOWER]: Leader election won for term 1
I20250624 19:58:10.256258  4325 raft_consensus.cc:695] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021 [term 1 LEADER]: Becoming Leader. State: Replica: 68b8a31f84ff465b97d2cb3c2fa3a021, State: Running, Role: LEADER
I20250624 19:58:10.257201  4325 consensus_queue.cc:237] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "9303e28334fc44629868721544e0dc96" member_type: VOTER last_known_addr { host: "127.3.15.67" port: 35785 } } peers { permanent_uuid: "68b8a31f84ff465b97d2cb3c2fa3a021" member_type: VOTER last_known_addr { host: "127.3.15.66" port: 44671 } } peers { permanent_uuid: "9bd87754848d4ac7a22df1de055c1cef" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 41589 } }
I20250624 19:58:10.266520  3840 catalog_manager.cc:5582] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021 reported cstate change: term changed from 0 to 1, leader changed from <none> to 68b8a31f84ff465b97d2cb3c2fa3a021 (127.3.15.66). New cstate: current_term: 1 leader_uuid: "68b8a31f84ff465b97d2cb3c2fa3a021" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "9303e28334fc44629868721544e0dc96" member_type: VOTER last_known_addr { host: "127.3.15.67" port: 35785 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "68b8a31f84ff465b97d2cb3c2fa3a021" member_type: VOTER last_known_addr { host: "127.3.15.66" port: 44671 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "9bd87754848d4ac7a22df1de055c1cef" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 41589 } health_report { overall_health: UNKNOWN } } }
W20250624 19:58:10.305004  4029 tablet.cc:2378] T 4c32bc3832404ac68aa4aac69ef1184d P 9bd87754848d4ac7a22df1de055c1cef: Can't schedule compaction. Clean time has not been advanced past its initial value.
W20250624 19:58:10.307170  4296 tablet.cc:2378] T 4c32bc3832404ac68aa4aac69ef1184d P 9303e28334fc44629868721544e0dc96: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20250624 19:58:10.307801  4250 raft_consensus.cc:1273] T 4c32bc3832404ac68aa4aac69ef1184d P 9303e28334fc44629868721544e0dc96 [term 1 FOLLOWER]: Refusing update from remote peer 68b8a31f84ff465b97d2cb3c2fa3a021: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20250624 19:58:10.308168  3983 raft_consensus.cc:1273] T 4c32bc3832404ac68aa4aac69ef1184d P 9bd87754848d4ac7a22df1de055c1cef [term 1 FOLLOWER]: Refusing update from remote peer 68b8a31f84ff465b97d2cb3c2fa3a021: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20250624 19:58:10.309868  4325 consensus_queue.cc:1035] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021 [LEADER]: Connected to new peer: Peer: permanent_uuid: "9303e28334fc44629868721544e0dc96" member_type: VOTER last_known_addr { host: "127.3.15.67" port: 35785 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20250624 19:58:10.310745  4330 consensus_queue.cc:1035] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021 [LEADER]: Connected to new peer: Peer: permanent_uuid: "9bd87754848d4ac7a22df1de055c1cef" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 41589 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20250624 19:58:10.395502  3983 tablet_service.cc:1940] Received Run Leader Election RPC: tablet_id: "4c32bc3832404ac68aa4aac69ef1184d"
dest_uuid: "9bd87754848d4ac7a22df1de055c1cef"
 from {username='slave'} at 127.0.0.1:52498
I20250624 19:58:10.396123  3983 raft_consensus.cc:491] T 4c32bc3832404ac68aa4aac69ef1184d P 9bd87754848d4ac7a22df1de055c1cef [term 1 FOLLOWER]: Starting forced leader election (received explicit request)
I20250624 19:58:10.396451  3983 raft_consensus.cc:3058] T 4c32bc3832404ac68aa4aac69ef1184d P 9bd87754848d4ac7a22df1de055c1cef [term 1 FOLLOWER]: Advancing to term 2
I20250624 19:58:10.400925  3983 raft_consensus.cc:513] T 4c32bc3832404ac68aa4aac69ef1184d P 9bd87754848d4ac7a22df1de055c1cef [term 2 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "9303e28334fc44629868721544e0dc96" member_type: VOTER last_known_addr { host: "127.3.15.67" port: 35785 } } peers { permanent_uuid: "68b8a31f84ff465b97d2cb3c2fa3a021" member_type: VOTER last_known_addr { host: "127.3.15.66" port: 44671 } } peers { permanent_uuid: "9bd87754848d4ac7a22df1de055c1cef" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 41589 } }
I20250624 19:58:10.403218  3983 leader_election.cc:290] T 4c32bc3832404ac68aa4aac69ef1184d P 9bd87754848d4ac7a22df1de055c1cef [CANDIDATE]: Term 2 election: Requested vote from peers 9303e28334fc44629868721544e0dc96 (127.3.15.67:35785), 68b8a31f84ff465b97d2cb3c2fa3a021 (127.3.15.66:44671)
I20250624 19:58:10.422530  4250 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "4c32bc3832404ac68aa4aac69ef1184d" candidate_uuid: "9bd87754848d4ac7a22df1de055c1cef" candidate_term: 2 candidate_status { last_received { term: 1 index: 2 } } ignore_live_leader: true dest_uuid: "9303e28334fc44629868721544e0dc96"
I20250624 19:58:10.422689  4116 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "4c32bc3832404ac68aa4aac69ef1184d" candidate_uuid: "9bd87754848d4ac7a22df1de055c1cef" candidate_term: 2 candidate_status { last_received { term: 1 index: 2 } } ignore_live_leader: true dest_uuid: "68b8a31f84ff465b97d2cb3c2fa3a021"
I20250624 19:58:10.423321  4250 raft_consensus.cc:3058] T 4c32bc3832404ac68aa4aac69ef1184d P 9303e28334fc44629868721544e0dc96 [term 1 FOLLOWER]: Advancing to term 2
I20250624 19:58:10.423506  4116 raft_consensus.cc:3053] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021 [term 1 LEADER]: Stepping down as leader of term 1
I20250624 19:58:10.423869  4116 raft_consensus.cc:738] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021 [term 1 LEADER]: Becoming Follower/Learner. State: Replica: 68b8a31f84ff465b97d2cb3c2fa3a021, State: Running, Role: LEADER
I20250624 19:58:10.424753  4116 consensus_queue.cc:260] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 2, Committed index: 2, Last appended: 1.2, Last appended by leader: 2, Current term: 1, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "9303e28334fc44629868721544e0dc96" member_type: VOTER last_known_addr { host: "127.3.15.67" port: 35785 } } peers { permanent_uuid: "68b8a31f84ff465b97d2cb3c2fa3a021" member_type: VOTER last_known_addr { host: "127.3.15.66" port: 44671 } } peers { permanent_uuid: "9bd87754848d4ac7a22df1de055c1cef" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 41589 } }
I20250624 19:58:10.426225  4116 raft_consensus.cc:3058] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021 [term 1 FOLLOWER]: Advancing to term 2
I20250624 19:58:10.432916  4250 raft_consensus.cc:2466] T 4c32bc3832404ac68aa4aac69ef1184d P 9303e28334fc44629868721544e0dc96 [term 2 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 9bd87754848d4ac7a22df1de055c1cef in term 2.
I20250624 19:58:10.434602  3918 leader_election.cc:304] T 4c32bc3832404ac68aa4aac69ef1184d P 9bd87754848d4ac7a22df1de055c1cef [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 9303e28334fc44629868721544e0dc96, 9bd87754848d4ac7a22df1de055c1cef; no voters: 
I20250624 19:58:10.435788  4116 raft_consensus.cc:2466] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021 [term 2 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 9bd87754848d4ac7a22df1de055c1cef in term 2.
I20250624 19:58:10.435986  4327 raft_consensus.cc:2802] T 4c32bc3832404ac68aa4aac69ef1184d P 9bd87754848d4ac7a22df1de055c1cef [term 2 FOLLOWER]: Leader election won for term 2
I20250624 19:58:10.439405  4327 raft_consensus.cc:695] T 4c32bc3832404ac68aa4aac69ef1184d P 9bd87754848d4ac7a22df1de055c1cef [term 2 LEADER]: Becoming Leader. State: Replica: 9bd87754848d4ac7a22df1de055c1cef, State: Running, Role: LEADER
I20250624 19:58:10.440690  4327 consensus_queue.cc:237] T 4c32bc3832404ac68aa4aac69ef1184d P 9bd87754848d4ac7a22df1de055c1cef [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 2, Committed index: 2, Last appended: 1.2, Last appended by leader: 2, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "9303e28334fc44629868721544e0dc96" member_type: VOTER last_known_addr { host: "127.3.15.67" port: 35785 } } peers { permanent_uuid: "68b8a31f84ff465b97d2cb3c2fa3a021" member_type: VOTER last_known_addr { host: "127.3.15.66" port: 44671 } } peers { permanent_uuid: "9bd87754848d4ac7a22df1de055c1cef" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 41589 } }
I20250624 19:58:10.448961  3839 catalog_manager.cc:5582] T 4c32bc3832404ac68aa4aac69ef1184d P 9bd87754848d4ac7a22df1de055c1cef reported cstate change: term changed from 1 to 2, leader changed from 68b8a31f84ff465b97d2cb3c2fa3a021 (127.3.15.66) to 9bd87754848d4ac7a22df1de055c1cef (127.3.15.65). New cstate: current_term: 2 leader_uuid: "9bd87754848d4ac7a22df1de055c1cef" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "9303e28334fc44629868721544e0dc96" member_type: VOTER last_known_addr { host: "127.3.15.67" port: 35785 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "68b8a31f84ff465b97d2cb3c2fa3a021" member_type: VOTER last_known_addr { host: "127.3.15.66" port: 44671 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "9bd87754848d4ac7a22df1de055c1cef" member_type: VOTER last_known_addr { host: "127.3.15.65" port: 41589 } health_report { overall_health: HEALTHY } } }
W20250624 19:58:10.477523  4076 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:50566: Illegal state: replica 68b8a31f84ff465b97d2cb3c2fa3a021 is not leader of this config: current role FOLLOWER
W20250624 19:58:10.499521  4210 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:53416: Illegal state: replica 9303e28334fc44629868721544e0dc96 is not leader of this config: current role FOLLOWER
W20250624 19:58:10.499503  4209 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:53416: Illegal state: replica 9303e28334fc44629868721544e0dc96 is not leader of this config: current role FOLLOWER
I20250624 19:58:10.534920  4250 raft_consensus.cc:1273] T 4c32bc3832404ac68aa4aac69ef1184d P 9303e28334fc44629868721544e0dc96 [term 2 FOLLOWER]: Refusing update from remote peer 9bd87754848d4ac7a22df1de055c1cef: Log matching property violated. Preceding OpId in replica: term: 1 index: 2. Preceding OpId from leader: term: 2 index: 4. (index mismatch)
I20250624 19:58:10.535586  4116 raft_consensus.cc:1273] T 4c32bc3832404ac68aa4aac69ef1184d P 68b8a31f84ff465b97d2cb3c2fa3a021 [term 2 FOLLOWER]: Refusing update from remote peer 9bd87754848d4ac7a22df1de055c1cef: Log matching property violated. Preceding OpId in replica: term: 1 index: 2. Preceding OpId from leader: term: 2 index: 4. (index mismatch)
I20250624 19:58:10.536152  4327 consensus_queue.cc:1035] T 4c32bc3832404ac68aa4aac69ef1184d P 9bd87754848d4ac7a22df1de055c1cef [LEADER]: Connected to new peer: Peer: permanent_uuid: "9303e28334fc44629868721544e0dc96" member_type: VOTER last_known_addr { host: "127.3.15.67" port: 35785 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 3, Last known committed idx: 2, Time since last communication: 0.000s
I20250624 19:58:10.537585  4351 consensus_queue.cc:1035] T 4c32bc3832404ac68aa4aac69ef1184d P 9bd87754848d4ac7a22df1de055c1cef [LEADER]: Connected to new peer: Peer: permanent_uuid: "68b8a31f84ff465b97d2cb3c2fa3a021" member_type: VOTER last_known_addr { host: "127.3.15.66" port: 44671 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 3, Last known committed idx: 2, Time since last communication: 0.001s
I20250624 19:58:10.573772  4334 mvcc.cc:204] Tried to move back new op lower bound from 7171256690816774144 to 7171256690457812992. Current Snapshot: MvccSnapshot[applied={T|T < 7171256690816774144}]
I20250624 19:58:10.743932  4335 mvcc.cc:204] Tried to move back new op lower bound from 7171256691529371648 to 7171256690457812992. Current Snapshot: MvccSnapshot[applied={T|T < 7171256691529371648 or (T in {7171256691529371648})}]
W20250624 19:58:29.674793  4024 debug-util.cc:398] Leaking SignalData structure 0x7b0800041400 after lost signal to thread 3900
W20250624 19:58:29.676136  4024 debug-util.cc:398] Leaking SignalData structure 0x7b08000b94c0 after lost signal to thread 4027
W20250624 19:58:37.395730  4291 debug-util.cc:398] Leaking SignalData structure 0x7b08000bb5e0 after lost signal to thread 4166
W20250624 19:58:37.396970  4291 debug-util.cc:398] Leaking SignalData structure 0x7b08000cd380 after lost signal to thread 4294
W20250624 19:58:39.248760  4352 meta_cache.cc:1261] Time spent looking up entry by key: real 0.060s	user 0.004s	sys 0.000s
W20250624 19:58:43.604602  4024 debug-util.cc:398] Leaking SignalData structure 0x7b08000ce2c0 after lost signal to thread 3900
W20250624 19:58:43.605729  4024 debug-util.cc:398] Leaking SignalData structure 0x7b08000b2940 after lost signal to thread 4027
W20250624 19:58:53.046633  3918 outbound_call.cc:321] RPC callback for RPC call kudu.consensus.ConsensusService.UpdateConsensus -> {remote=127.3.15.67:35785, user_credentials={real_user=slave}} blocked reactor thread for 48401.8us
W20250624 19:59:03.039881  4344 meta_cache.cc:1261] Time spent looking up entry by key: real 0.087s	user 0.003s	sys 0.000s
/home/jenkins-slave/workspace/build_and_test_flaky@2/src/kudu/integration-tests/tablet_copy-itest.cc:2151: Failure
Failed
Bad status: Timed out: Timed out waiting for number of WAL segments on tablet 4c32bc3832404ac68aa4aac69ef1184d on TS 0 to be 6. Found 5
I20250624 19:59:10.674275  3133 external_mini_cluster-itest-base.cc:80] Found fatal failure
I20250624 19:59:10.674808  3133 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 0 with UUID 9bd87754848d4ac7a22df1de055c1cef and pid 3899
************************ BEGIN STACKS **************************
W20250624 19:59:11.495973  4024 debug-util.cc:398] Leaking SignalData structure 0x7b08000c2b20 after lost signal to thread 3900
W20250624 19:59:11.497035  4024 debug-util.cc:398] Leaking SignalData structure 0x7b08000b2b00 after lost signal to thread 4027
[New LWP 3900]
[New LWP 3901]
[New LWP 3902]
[New LWP 3903]
[New LWP 3904]
[New LWP 3911]
[New LWP 3912]
[New LWP 3913]
[New LWP 3916]
[New LWP 3917]
[New LWP 3918]
[New LWP 3919]
[New LWP 3920]
[New LWP 3921]
[New LWP 3922]
[New LWP 3923]
[New LWP 3924]
[New LWP 3925]
[New LWP 3926]
[New LWP 3927]
[New LWP 3928]
[New LWP 3929]
[New LWP 3930]
[New LWP 3931]
[New LWP 3932]
[New LWP 3933]
[New LWP 3934]
[New LWP 3935]
[New LWP 3936]
[New LWP 3937]
[New LWP 3938]
[New LWP 3939]
[New LWP 3940]
[New LWP 3941]
[New LWP 3942]
[New LWP 3943]
[New LWP 3944]
[New LWP 3945]
[New LWP 3946]
[New LWP 3947]
[New LWP 3948]
[New LWP 3949]
[New LWP 3950]
[New LWP 3951]
[New LWP 3952]
[New LWP 3953]
[New LWP 3954]
[New LWP 3955]
[New LWP 3956]
[New LWP 3957]
[New LWP 3958]
[New LWP 3959]
[New LWP 3960]
[New LWP 3961]
[New LWP 3962]
[New LWP 3963]
[New LWP 3964]
[New LWP 3965]
[New LWP 3966]
[New LWP 3967]
[New LWP 3968]
[New LWP 3969]
[New LWP 3970]
[New LWP 3971]
[New LWP 3972]
[New LWP 3973]
[New LWP 3974]
[New LWP 3975]
[New LWP 3976]
[New LWP 3977]
[New LWP 3978]
[New LWP 3979]
[New LWP 3980]
[New LWP 3981]
[New LWP 3982]
[New LWP 3983]
[New LWP 3984]
[New LWP 3985]
[New LWP 3986]
[New LWP 3987]
[New LWP 3988]
[New LWP 3989]
[New LWP 3990]
[New LWP 3991]
[New LWP 3992]
[New LWP 3993]
[New LWP 3994]
[New LWP 3995]
[New LWP 3996]
[New LWP 3997]
[New LWP 3998]
[New LWP 3999]
[New LWP 4000]
[New LWP 4001]
[New LWP 4002]
[New LWP 4003]
[New LWP 4004]
[New LWP 4005]
[New LWP 4006]
[New LWP 4007]
[New LWP 4008]
[New LWP 4009]
[New LWP 4010]
[New LWP 4011]
[New LWP 4012]
[New LWP 4013]
[New LWP 4014]
[New LWP 4015]
[New LWP 4016]
[New LWP 4017]
[New LWP 4018]
[New LWP 4019]
[New LWP 4020]
[New LWP 4021]
[New LWP 4022]
[New LWP 4023]
[New LWP 4024]
[New LWP 4025]
[New LWP 4026]
[New LWP 4027]
[New LWP 4028]
[New LWP 4029]
[New LWP 4337]
[New LWP 4519]
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c4802038e
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c4802038e
0x00007fee48c8fd50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 3899 "kudu"   0x00007fee48c8fd50 in ?? ()
  2    LWP 3900 "kudu"   0x00007fee440537a0 in ?? ()
  3    LWP 3901 "kudu"   0x00007fee48c8bfb9 in ?? ()
  4    LWP 3902 "kudu"   0x00007fee48c8bfb9 in ?? ()
  5    LWP 3903 "kudu"   0x00007fee48c8bfb9 in ?? ()
  6    LWP 3904 "kernel-watcher-" 0x00007fee48c8bfb9 in ?? ()
  7    LWP 3911 "ntp client-3911" 0x00007fee48c8f9e2 in ?? ()
  8    LWP 3912 "file cache-evic" 0x00007fee48c8bfb9 in ?? ()
  9    LWP 3913 "sq_acceptor" 0x00007fee44083cb9 in ?? ()
  10   LWP 3916 "rpc reactor-391" 0x00007fee44090a47 in ?? ()
  11   LWP 3917 "rpc reactor-391" 0x00007fee44090a47 in ?? ()
  12   LWP 3918 "rpc reactor-391" 0x00007fee44090a47 in ?? ()
  13   LWP 3919 "rpc reactor-391" 0x00007fee44090a47 in ?? ()
  14   LWP 3920 "MaintenanceMgr " 0x00007fee48c8bad3 in ?? ()
  15   LWP 3921 "txn-status-mana" 0x00007fee48c8bfb9 in ?? ()
  16   LWP 3922 "collect_and_rem" 0x00007fee48c8bfb9 in ?? ()
  17   LWP 3923 "tc-session-exp-" 0x00007fee48c8bfb9 in ?? ()
  18   LWP 3924 "rpc worker-3924" 0x00007fee48c8bad3 in ?? ()
  19   LWP 3925 "rpc worker-3925" 0x00007fee48c8bad3 in ?? ()
  20   LWP 3926 "rpc worker-3926" 0x00007fee48c8bad3 in ?? ()
  21   LWP 3927 "rpc worker-3927" 0x00007fee48c8bad3 in ?? ()
  22   LWP 3928 "rpc worker-3928" 0x00007fee48c8bad3 in ?? ()
  23   LWP 3929 "rpc worker-3929" 0x00007fee48c8bad3 in ?? ()
  24   LWP 3930 "rpc worker-3930" 0x00007fee48c8bad3 in ?? ()
  25   LWP 3931 "rpc worker-3931" 0x00007fee48c8bad3 in ?? ()
  26   LWP 3932 "rpc worker-3932" 0x00007fee48c8bad3 in ?? ()
  27   LWP 3933 "rpc worker-3933" 0x00007fee48c8bad3 in ?? ()
  28   LWP 3934 "rpc worker-3934" 0x00007fee48c8bad3 in ?? ()
  29   LWP 3935 "rpc worker-3935" 0x00007fee48c8bad3 in ?? ()
  30   LWP 3936 "rpc worker-3936" 0x00007fee48c8bad3 in ?? ()
  31   LWP 3937 "rpc worker-3937" 0x00007fee48c8bad3 in ?? ()
  32   LWP 3938 "rpc worker-3938" 0x00007fee48c8bad3 in ?? ()
  33   LWP 3939 "rpc worker-3939" 0x00007fee48c8bad3 in ?? ()
  34   LWP 3940 "rpc worker-3940" 0x00007fee48c8bad3 in ?? ()
  35   LWP 3941 "rpc worker-3941" 0x00007fee48c8bad3 in ?? ()
  36   LWP 3942 "rpc worker-3942" 0x00007fee48c8bad3 in ?? ()
  37   LWP 3943 "rpc worker-3943" 0x00007fee48c8bad3 in ?? ()
  38   LWP 3944 "rpc worker-3944" 0x00007fee48c8bad3 in ?? ()
  39   LWP 3945 "rpc worker-3945" 0x00007fee48c8bad3 in ?? ()
  40   LWP 3946 "rpc worker-3946" 0x00007fee48c8bad3 in ?? ()
  41   LWP 3947 "rpc worker-3947" 0x00007fee48c8bad3 in ?? ()
  42   LWP 3948 "rpc worker-3948" 0x00007fee48c8bad3 in ?? ()
  43   LWP 3949 "rpc worker-3949" 0x00007fee48c8bad3 in ?? ()
  44   LWP 3950 "rpc worker-3950" 0x00007fee48c8bad3 in ?? ()
  45   LWP 3951 "rpc worker-3951" 0x00007fee48c8bad3 in ?? ()
  46   LWP 3952 "rpc worker-3952" 0x00007fee48c8bad3 in ?? ()
  47   LWP 3953 "rpc worker-3953" 0x00007fee48c8bad3 in ?? ()
  48   LWP 3954 "rpc worker-3954" 0x00007fee48c8bad3 in ?? ()
  49   LWP 3955 "rpc worker-3955" 0x00007fee48c8bad3 in ?? ()
  50   LWP 3956 "rpc worker-3956" 0x00007fee48c8bad3 in ?? ()
  51   LWP 3957 "rpc worker-3957" 0x00007fee48c8bad3 in ?? ()
  52   LWP 3958 "rpc worker-3958" 0x00007fee48c8bad3 in ?? ()
  53   LWP 3959 "rpc worker-3959" 0x00007fee48c8bad3 in ?? ()
  54   LWP 3960 "rpc worker-3960" 0x00007fee48c8bad3 in ?? ()
  55   LWP 3961 "rpc worker-3961" 0x00007fee48c8bad3 in ?? ()
  56   LWP 3962 "rpc worker-3962" 0x00007fee48c8bad3 in ?? ()
  57   LWP 3963 "rpc worker-3963" 0x00007fee48c8bad3 in ?? ()
  58   LWP 3964 "rpc worker-3964" 0x00007fee48c8bad3 in ?? ()
  59   LWP 3965 "rpc worker-3965" 0x00007fee48c8bad3 in ?? ()
  60   LWP 3966 "rpc worker-3966" 0x00007fee48c8bad3 in ?? ()
  61   LWP 3967 "rpc worker-3967" 0x00007fee48c8bad3 in ?? ()
  62   LWP 3968 "rpc worker-3968" 0x00007fee48c8bad3 in ?? ()
  63   LWP 3969 "rpc worker-3969" 0x00007fee48c8bad3 in ?? ()
  64   LWP 3970 "rpc worker-3970" 0x00007fee48c8bad3 in ?? ()
  65   LWP 3971 "rpc worker-3971" 0x00007fee48c8bad3 in ?? ()
  66   LWP 3972 "rpc worker-3972" 0x00007fee48c8bad3 in ?? ()
  67   LWP 3973 "rpc worker-3973" 0x00007fee48c8bad3 in ?? ()
  68   LWP 3974 "rpc worker-3974" 0x00007fee48c8bad3 in ?? ()
  69   LWP 3975 "rpc worker-3975" 0x00007fee48c8bad3 in ?? ()
  70   LWP 3976 "rpc worker-3976" 0x00007fee48c8bad3 in ?? ()
  71   LWP 3977 "rpc worker-3977" 0x00007fee48c8bad3 in ?? ()
  72   LWP 3978 "rpc worker-3978" 0x00007fee48c8bad3 in ?? ()
  73   LWP 3979 "rpc worker-3979" 0x00007fee48c8bad3 in ?? ()
  74   LWP 3980 "rpc worker-3980" 0x00007fee48c8bad3 in ?? ()
  75   LWP 3981 "rpc worker-3981" 0x00007fee48c8bad3 in ?? ()
  76   LWP 3982 "rpc worker-3982" 0x00007fee48c8bad3 in ?? ()
  77   LWP 3983 "rpc worker-3983" 0x00007fee48c8bad3 in ?? ()
  78   LWP 3984 "rpc worker-3984" 0x00007fee48c8bad3 in ?? ()
  79   LWP 3985 "rpc worker-3985" 0x00007fee48c8bad3 in ?? ()
  80   LWP 3986 "rpc worker-3986" 0x00007fee48c8bad3 in ?? ()
  81   LWP 3987 "rpc worker-3987" 0x00007fee48c8bad3 in ?? ()
  82   LWP 3988 "rpc worker-3988" 0x00007fee48c8bad3 in ?? ()
  83   LWP 3989 "rpc worker-3989" 0x00007fee48c8bad3 in ?? ()
  84   LWP 3990 "rpc worker-3990" 0x00007fee48c8bad3 in ?? ()
  85   LWP 3991 "rpc worker-3991" 0x00007fee48c8bad3 in ?? ()
  86   LWP 3992 "rpc worker-3992" 0x00007fee48c8bad3 in ?? ()
  87   LWP 3993 "rpc worker-3993" 0x00007fee48c8bad3 in ?? ()
  88   LWP 3994 "rpc worker-3994" 0x00007fee48c8bad3 in ?? ()
  89   LWP 3995 "rpc worker-3995" 0x00007fee48c8bad3 in ?? ()
  90   LWP 3996 "rpc worker-3996" 0x00007fee48c8bad3 in ?? ()
  91   LWP 3997 "rpc worker-3997" 0x00007fee48c8bad3 in ?? ()
  92   LWP 3998 "rpc worker-3998" 0x00007fee48c8bad3 in ?? ()
  93   LWP 3999 "rpc worker-3999" 0x00007fee48c8bad3 in ?? ()
  94   LWP 4000 "rpc worker-4000" 0x00007fee48c8bad3 in ?? ()
  95   LWP 4001 "rpc worker-4001" 0x00007fee48c8bad3 in ?? ()
  96   LWP 4002 "rpc worker-4002" 0x00007fee48c8bad3 in ?? ()
  97   LWP 4003 "rpc worker-4003" 0x00007fee48c8bad3 in ?? ()
  98   LWP 4004 "rpc worker-4004" 0x00007fee48c8bad3 in ?? ()
  99   LWP 4005 "rpc worker-4005" 0x00007fee48c8bad3 in ?? ()
  100  LWP 4006 "rpc worker-4006" 0x00007fee48c8bad3 in ?? ()
  101  LWP 4007 "rpc worker-4007" 0x00007fee48c8bad3 in ?? ()
  102  LWP 4008 "rpc worker-4008" 0x00007fee48c8bad3 in ?? ()
  103  LWP 4009 "rpc worker-4009" 0x00007fee48c8bad3 in ?? ()
  104  LWP 4010 "rpc worker-4010" 0x00007fee48c8bad3 in ?? ()
  105  LWP 4011 "rpc worker-4011" 0x00007fee48c8bad3 in ?? ()
  106  LWP 4012 "rpc worker-4012" 0x00007fee48c8bad3 in ?? ()
  107  LWP 4013 "rpc worker-4013" 0x00007fee48c8bad3 in ?? ()
  108  LWP 4014 "rpc worker-4014" 0x00007fee48c8bad3 in ?? ()
  109  LWP 4015 "rpc worker-4015" 0x00007fee48c8bad3 in ?? ()
  110  LWP 4016 "rpc worker-4016" 0x00007fee48c8bad3 in ?? ()
  111  LWP 4017 "rpc worker-4017" 0x00007fee48c8bad3 in ?? ()
  112  LWP 4018 "rpc worker-4018" 0x00007fee48c8bad3 in ?? ()
  113  LWP 4019 "rpc worker-4019" 0x00007fee48c8bad3 in ?? ()
  114  LWP 4020 "rpc worker-4020" 0x00007fee48c8bad3 in ?? ()
  115  LWP 4021 "rpc worker-4021" 0x00007fee48c8bad3 in ?? ()
  116  LWP 4022 "rpc worker-4022" 0x00007fee48c8bad3 in ?? ()
  117  LWP 4023 "rpc worker-4023" 0x00007fee48c8bad3 in ?? ()
  118  LWP 4024 "diag-logger-402" 0x00007fee48c9008f in ?? ()
  119  LWP 4025 "result-tracker-" 0x00007fee48c8bfb9 in ?? ()
  120  LWP 4026 "excess-log-dele" 0x00007fee48c8bfb9 in ?? ()
  121  LWP 4027 "acceptor-4027" 0x00007fee440920c7 in ?? ()
  122  LWP 4028 "heartbeat-4028" 0x00007fee48c8bfb9 in ?? ()
  123  LWP 4029 "maintenance_sch" 0x00007fee48c8bfb9 in ?? ()
  124  LWP 4337 "wal-append [wor" 0x00007fee48c8bfb9 in ?? ()
  125  LWP 4519 "raft [worker]-4" 0x00007fee48c8bfb9 in ?? ()

Thread 125 (LWP 4519):
#0  0x00007fee48c8bfb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 124 (LWP 4337):
#0  0x00007fee48c8bfb9 in ?? ()
#1  0x00007b10000563f0 in ?? ()
#2  0x00000000000010e9 in ?? ()
#3  0x0000000100000081 in ?? ()
#4  0x00007b640006001c in ?? ()
#5  0x00007fedfa4bd440 in ?? ()
#6  0x00000000000021d3 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x00007fedfa4bd460 in ?? ()
#9  0x0000000200000000 in ?? ()
#10 0x0000008000000189 in ?? ()
#11 0x00007fee4210e008 in ?? ()
#12 0x00007fed00000001 in ?? ()
#13 0x0000000000000000 in ?? ()

Thread 123 (LWP 4029):
#0  0x00007fee48c8bfb9 in ?? ()
#1  0x00007f0100000000 in ?? ()
#2  0x0000000000000103 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b54000028f0 in ?? ()
#5  0x00007fedfceb96c0 in ?? ()
#6  0x0000000000000206 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 4028):
#0  0x00007fee48c8bfb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 121 (LWP 4027):
#0  0x00007fee440920c7 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 120 (LWP 4026):
#0  0x00007fee48c8bfb9 in ?? ()
#1  0x00007fedfe6bc940 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffebb936b00 in ?? ()
#5  0x00007fedfe6bc7b0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 4025):
#0  0x00007fee48c8bfb9 in ?? ()
#1  0x0000000085352f88 in ?? ()
#2  0x0000000000000041 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b3400001008 in ?? ()
#5  0x00007fedfeebd800 in ?? ()
#6  0x0000000000000082 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 118 (LWP 4024):
#0  0x00007fee48c9008f in ?? ()
#1  0x000000000000000a in ?? ()
#2  0x01ece0000f5c9e2a in ?? ()
#3  0x00007fedff6bdb30 in ?? ()
#4  0x00007fedff6bfb80 in ?? ()
#5  0x00007fedff6bdb30 in ?? ()
#6  0x0000000000000029 in ?? ()
#7  0x00007fedff6bdfc8 in ?? ()
#8  0x000000000046ddd7 in __sanitizer::internal_alloc_placeholder ()
#9  0x000000000046dd49 in __sanitizer::internal_alloc_placeholder ()
#10 0x00007fedff6bdeb9 in ?? ()
#11 0x00007fedff6bfb80 in ?? ()
#12 0x00007fee453fb4b3 in ?? ()
#13 0x00007fedff6b0000 in ?? ()
#14 0x00007fee453fb2b2 in ?? ()
#15 0x00007fedff6b0000 in ?? ()
#16 0x00007fedff6bde4b in ?? ()
#17 0x00007fedff6be0c0 in ?? ()
#18 0x0000000000000025 in ?? ()
#19 0x00007fedff6bde4a in ?? ()
#20 0x00000fb92c330000 in ?? ()
#21 0x0000000000000040 in ?? ()
#22 0x00007fee453fb4b3 in ?? ()
#23 0x3062306332396266 in ?? ()
#24 0x323962662d303030 in ?? ()
#25 0x2030303030333163 in ?? ()
#26 0x30303020702d2d72 in ?? ()
#27 0x3030203030303030 in ?? ()
#28 0x363437372031343a in ?? ()
#29 0x2020202020203031 in ?? ()
#30 0x2020202020202020 in ?? ()
#31 0x2020202020202020 in ?? ()
#32 0x69642f706d742f20 in ?? ()
#33 0x2d747365742d7473 in ?? ()
#34 0x355737336b736174 in ?? ()
#35 0x2d747365742f4b68 in ?? ()
#36 0x6e6173742f706d74 in ?? ()
#37 0x2e617461646f722e in ?? ()
#38 0x6564282039393833 in ?? ()
#39 0x660029646574656c in ?? ()
#40 0x3030333163323962 in ?? ()
#41 0x63323962662d3030 in ?? ()
#42 0x7220303030306231 in ?? ()
#43 0x3030303020702d2d in ?? ()
#44 0x3a30302030303030 in ?? ()
#45 0x3136343737203134 in ?? ()
#46 0x2020202020202030 in ?? ()
#47 0x2020202020202020 in ?? ()
#48 0x2020202020202020 in ?? ()
#49 0x7369642f706d742f in ?? ()
#50 0x742d747365742d74 in ?? ()
#51 0x68355737336b7361 in ?? ()
#52 0x742d747365742f4b in ?? ()
#53 0x2e6e6173742f706d in ?? ()
#54 0x332e617461646f72 in ?? ()
#55 0x6c65642820393938 in ?? ()
#56 0x6266002964657465 in ?? ()
#57 0x3030306231633239 in ?? ()
#58 0x3263323962662d30 in ?? ()
#59 0x2d72203030303033 in ?? ()
#60 0x303030303020702d in ?? ()
#61 0x343a303020303030 in ?? ()
#62 0x3031363437372031 in ?? ()
#63 0x2020202020202020 in ?? ()
#64 0x2020202020202020 in ?? ()
#65 0x2f20202020202020 in ?? ()
#66 0x747369642f706d74 in ?? ()
#67 0x61742d747365742d in ?? ()
#68 0x4b68355737336b73 in ?? ()
#69 0x6d742d747365742f in ?? ()
#70 0x722e6e6173742f70 in ?? ()
#71 0x38332e617461646f in ?? ()
#72 0x656c656428203939 in ?? ()
#73 0x3962660029646574 in ?? ()
#74 0x3030303033326332 in ?? ()
#75 0x623263323962662d in ?? ()
#76 0x2d2d722030303030 in ?? ()
#77 0x3030303030302070 in ?? ()
#78 0x31343a3030203030 in ?? ()
#79 0x2030313634373720 in ?? ()
#80 0x2020202020202020 in ?? ()
#81 0x2020202020202020 in ?? ()
#82 0x742f202020202020 in ?? ()
#83 0x2d747369642f706d in ?? ()
#84 0x7361742d74736574 in ?? ()
#85 0x2f4b68355737336b in ?? ()
#86 0x706d742d74736574 in ?? ()
#87 0x6f722e6e6173742f in ?? ()
#88 0x3938332e61746164 in ?? ()
#89 0x74656c6564282039 in ?? ()
#90 0x3239626600296465 in ?? ()
#91 0x2d30303030623263 in ?? ()
#92 0x3033336332396266 in ?? ()
#93 0x702d2d7220303030 in ?? ()
#94 0x3030303030303020 in ?? ()
#95 0x2031343a30302030 in ?? ()
#96 0x2020303136343737 in ?? ()
#97 0x2020202020202020 in ?? ()
#98 0x2020202020202020 in ?? ()
#99 0x6d742f2020202020 in ?? ()
#100 0x742d747369642f70 in ?? ()
#101 0x6b7361742d747365 in ?? ()
#102 0x742f4b6835573733 in ?? ()
#103 0x2f706d742d747365 in ?? ()
#104 0x646f722e6e617374 in ?? ()
#105 0x393938332e617461 in ?? ()
#106 0x6574656c65642820 in ?? ()
#107 0x6332396266002964 in ?? ()
#108 0x662d303030303333 in ?? ()
#109 0x3030623363323962 in ?? ()
#110 0x20702d2d72203030 in ?? ()
#111 0x3030303030303030 in ?? ()
#112 0x372031343a303020 in ?? ()
#113 0x2020203031363437 in ?? ()
#114 0x2020202020202020 in ?? ()
#115 0x2020202020202020 in ?? ()
#116 0x706d742f20202020 in ?? ()
#117 0x65742d747369642f in ?? ()
#118 0x336b7361742d7473 in ?? ()
#119 0x65742f4b68355737 in ?? ()
#120 0x742f706d742d7473 in ?? ()
#121 0x61646f722e6e6173 in ?? ()
#122 0x20393938332e6174 in ?? ()
#123 0x646574656c656428 in ?? ()
#124 0x3363323962660029 in ?? ()
#125 0x62662d3030303062 in ?? ()
#126 0x3030303334633239 in ?? ()
#127 0x3020702d2d722030 in ?? ()
#128 0x2030303030303030 in ?? ()
#129 0x37372031343a3030 in ?? ()
#130 0x2020202030313634 in ?? ()
#131 0x2020202020202020 in ?? ()
#132 0x2020202020202020 in ?? ()
#133 0x2f706d742f202020 in ?? ()
#134 0x7365742d74736964 in ?? ()
#135 0x37336b7361742d74 in ?? ()
#136 0x7365742f4b683557 in ?? ()
#137 0x73742f706d742d74 in ?? ()
#138 0x7461646f722e6e61 in ?? ()
#139 0x2820393938332e61 in ?? ()
#140 0x29646574656c6564 in ?? ()
#141 0x333463323962660a in ?? ()
#142 0x3962662d30303030 in ?? ()
#143 0x3030303062346332 in ?? ()
#144 0x303020702d2d7220 in ?? ()
#145 0x3020303030303030 in ?? ()
#146 0x3437372031343a30 in ?? ()
#147 0x2020202020303136 in ?? ()
#148 0x2020202020202020 in ?? ()
#149 0x2020202020202020 in ?? ()
#150 0x642f706d742f2020 in ?? ()
#151 0x00007fedff6be020 in ?? ()
#152 0x000000000052f42c in __sanitizer::theDepot ()
#153 0x00000000004e5308 in __sanitizer::theDepot ()
#154 0xaf8a040000000000 in ?? ()
#155 0x00007fedff6be282 in ?? ()
#156 0x00000000004dfcfb in __sanitizer::theDepot ()
#157 0x01ec84000f5c07d6 in ?? ()
#158 0xffffffffffffffff in ?? ()
#159 0xffffffffffffffff in ?? ()
#160 0xffffffffffffffff in ?? ()
#161 0xffffffffffffffff in ?? ()
#162 0xffffffffffffffff in ?? ()
#163 0xffffffffffffffff in ?? ()
#164 0xffffffffffffffff in ?? ()
#165 0xffffffffffffffff in ?? ()
#166 0x00000000000003ff in ?? ()
#167 0x00007fedff6be261 in ?? ()
#168 0x00000fb92c3b0000 in ?? ()
#169 0x00007fedff6bde4f in ?? ()
#170 0x0000000000000400 in ?? ()
#171 0x00007fee45400021 in ?? ()
#172 0x00007fee4716d574 in ?? ()
#173 0x00007fedff6bde4b in ?? ()
#174 0x000000000046924d in __sanitizer::internal_alloc_placeholder ()
#175 0x00007fedff6be1d8 in ?? ()
#176 0x0000000000000000 in ?? ()

Thread 117 (LWP 4023):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 116 (LWP 4022):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 115 (LWP 4021):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 114 (LWP 4020):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 4019):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 4018):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 4017):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 4016):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 4015):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 4014):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 4013):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 4012):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 4011):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 4010):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 4009):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 4008):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 4007):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 4006):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 4005):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 4004):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 97 (LWP 4003):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 4002):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 4001):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 4000):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 3999):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 3998):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 3997):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 3996):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 3995):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 3994):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 3993):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 3992):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 3991):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 3990):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 3989):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 3988):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 3987):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 3986):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 3985):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 3984):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 3983):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000006 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00007b24001147c8 in ?? ()
#4  0x00007fee148ba710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fee148ba730 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 76 (LWP 3982):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 75 (LWP 3981):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 74 (LWP 3980):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 3979):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 3978):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 3977):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 3976):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 3975):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 3974):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 3973):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 3972):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 3971):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 3970):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 3969):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 3968):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 3967):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 3966):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 3965):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 3964):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 3963):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b24000b902c in ?? ()
#4  0x00007fee1ecbc710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fee1ecbc730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x007f0400000026c2 in ?? ()
#9  0x00007fee48c8b770 in ?? ()
#10 0x00007fee1ecbc730 in ?? ()
#11 0x0002008300000dfe in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 56 (LWP 3962):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 55 (LWP 3961):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 3960):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 3959):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 3958):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 3957):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 3956):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 3955):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 3954):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 3953):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 3952):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 3951):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 3950):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 3949):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 3948):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 3947):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 3946):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 3945):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 3944):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 3943):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000003 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240005ffec in ?? ()
#4  0x00007fee290be710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fee290be730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000000000045e4c9 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007fee48c8b770 in ?? ()
#10 0x00007fee290be730 in ?? ()
#11 0x00007fee4109cc60 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 36 (LWP 3942):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x00000000000004cb in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240005d7fc in ?? ()
#4  0x00007fee29ab6710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fee29ab6730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000000000045e4c9 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007fee48c8b770 in ?? ()
#10 0x00007fee29ab6730 in ?? ()
#11 0x00007fedfc4afca0 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 35 (LWP 3941):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x000000000000032b in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b2400058ffc in ?? ()
#4  0x00007fee2a2b7710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fee2a2b7730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000000000045e4c9 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007fee48c8b770 in ?? ()
#10 0x00007fee2a2b7730 in ?? ()
#11 0x00007fedfaeacaa0 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 34 (LWP 3940):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x00000000000000a2 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00007b24000547f8 in ?? ()
#4  0x00007fee2aab8710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fee2aab8730 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 33 (LWP 3939):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240004fffc in ?? ()
#4  0x00007fee2b2b9710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fee2b2b9730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000000000045e4c9 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007fee48c8b770 in ?? ()
#10 0x00007fee2b2b9730 in ?? ()
#11 0x00007fee415ffc60 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 32 (LWP 3938):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240004900c in ?? ()
#4  0x00007fee2baba710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fee2baba730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000000000045e4c9 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007fee48c8b770 in ?? ()
#10 0x00007fee2baba730 in ?? ()
#11 0x00007fee415f7c60 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 31 (LWP 3937):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240004480c in ?? ()
#4  0x00007fee2c2bb710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007fee2c2bb730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000000000045e4c9 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007fee48c8b770 in ?? ()
#10 0x00007fee2c2bb730 in ?? ()
#11 0x00007fee415efc60 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 30 (LWP 3936):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 3935):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 3934):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 3933):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 3932):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 3931):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 3930):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 3929):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 3928):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 3927):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 3926):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 19 (LWP 3925):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 18 (LWP 3924):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 17 (LWP 3923):
#0  0x00007fee48c8bfb9 in ?? ()
#1  0x0000000017a335c0 in ?? ()
#2  0x0000000000000006 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4800003a00 in ?? ()
#5  0x00007fee3368e700 in ?? ()
#6  0x000000000000000c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 16 (LWP 3922):
#0  0x00007fee48c8bfb9 in ?? ()
#1  0x00007fee33e8f9a8 in ?? ()
#2  0x000000000000000d in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b44000372d8 in ?? ()
#5  0x00007fee33e8f840 in ?? ()
#6  0x000000000000001a in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 15 (LWP 3921):
#0  0x00007fee48c8bfb9 in ?? ()
#1  0x0000000000000018 in ?? ()
#2  0x0000000000000006 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b5800000118 in ?? ()
#5  0x00007fee34690410 in ?? ()
#6  0x000000000000000c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 3920):
#0  0x00007fee48c8bad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 13 (LWP 3919):
#0  0x00007fee44090a47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 3918):
#0  0x00007fee44090a47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 11 (LWP 3917):
#0  0x00007fee44090a47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 10 (LWP 3916):
#0  0x00007fee44090a47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 9 (LWP 3913):
#0  0x00007fee44083cb9 in ?? ()
#1  0x00007fee3cebcc10 in ?? ()
#2  0x00007b0400009010 in ?? ()
#3  0x00007fee3cebdb80 in ?? ()
#4  0x00007fee3cebcc10 in ?? ()
#5  0x00007b0400009010 in ?? ()
#6  0x00000000004888a3 in __sanitizer::internal_alloc_placeholder ()
#7  0x00007fee419c2000 in ?? ()
#8  0x0100000000000001 in ?? ()
#9  0x00007fee3cebdb80 in ?? ()
#10 0x00007fee4da68908 in ?? ()
#11 0x0000000000000000 in ?? ()

Thread 8 (LWP 3912):
#0  0x00007fee48c8bfb9 in ?? ()
#1  0x0000600000000000 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4400034018 in ?? ()
#5  0x00007fee3c6bb7f0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 7 (LWP 3911):
#0  0x00007fee48c8f9e2 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 6 (LWP 3904):
#0  0x00007fee48c8bfb9 in ?? ()
#1  0x00007fee3debea40 in ?? ()
#2  0x000000000000014a in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4400035b98 in ?? ()
#5  0x00007fee3debe5d0 in ?? ()
#6  0x0000000000000294 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 5 (LWP 3903):
#0  0x00007fee48c8bfb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 4 (LWP 3902):
#0  0x00007fee48c8bfb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 3 (LWP 3901):
#0  0x00007fee48c8bfb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 2 (LWP 3900):
#0  0x00007fee440537a0 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 1 (LWP 3899):
#0  0x00007fee48c8fd50 in ?? ()
#1  0x00007ffebb936970 in ?? ()
#2  0x0000000000467b2b in __sanitizer::internal_alloc_placeholder ()
#3  0x00007fee432b1cc0 in ?? ()
#4  0x00007fee432b1cc0 in ?? ()
#5  0x00007ffebb936910 in ?? ()
#6  0x000000000048aef4 in __sanitizer::internal_alloc_placeholder ()
#7  0x0000600001000078 in ?? ()
#8  0xffffffff00aa1299 in ?? ()
#9  0x00007fee432b1cc0 in ?? ()
#10 0x00007fee471b7f0b in ?? ()
#11 0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20250624 19:59:11.730051  3133 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 1 with UUID 68b8a31f84ff465b97d2cb3c2fa3a021 and pid 4032
************************ BEGIN STACKS **************************
[New LWP 4033]
[New LWP 4034]
[New LWP 4035]
[New LWP 4036]
[New LWP 4037]
[New LWP 4044]
[New LWP 4045]
[New LWP 4046]
[New LWP 4049]
[New LWP 4050]
[New LWP 4051]
[New LWP 4052]
[New LWP 4053]
[New LWP 4054]
[New LWP 4055]
[New LWP 4056]
[New LWP 4057]
[New LWP 4058]
[New LWP 4059]
[New LWP 4060]
[New LWP 4061]
[New LWP 4062]
[New LWP 4063]
[New LWP 4064]
[New LWP 4065]
[New LWP 4066]
[New LWP 4067]
[New LWP 4068]
[New LWP 4069]
[New LWP 4070]
[New LWP 4071]
[New LWP 4072]
[New LWP 4073]
[New LWP 4074]
[New LWP 4075]
[New LWP 4076]
[New LWP 4077]
[New LWP 4078]
[New LWP 4079]
[New LWP 4080]
[New LWP 4081]
[New LWP 4082]
[New LWP 4083]
[New LWP 4084]
[New LWP 4085]
[New LWP 4086]
[New LWP 4087]
[New LWP 4088]
[New LWP 4089]
[New LWP 4090]
[New LWP 4091]
[New LWP 4092]
[New LWP 4093]
[New LWP 4094]
[New LWP 4095]
[New LWP 4096]
[New LWP 4097]
[New LWP 4098]
[New LWP 4099]
[New LWP 4100]
[New LWP 4101]
[New LWP 4102]
[New LWP 4103]
[New LWP 4104]
[New LWP 4105]
[New LWP 4106]
[New LWP 4107]
[New LWP 4108]
[New LWP 4109]
[New LWP 4110]
[New LWP 4111]
[New LWP 4112]
[New LWP 4113]
[New LWP 4114]
[New LWP 4115]
[New LWP 4116]
[New LWP 4117]
[New LWP 4118]
[New LWP 4119]
[New LWP 4120]
[New LWP 4121]
[New LWP 4122]
[New LWP 4123]
[New LWP 4124]
[New LWP 4125]
[New LWP 4126]
[New LWP 4127]
[New LWP 4128]
[New LWP 4129]
[New LWP 4130]
[New LWP 4131]
[New LWP 4132]
[New LWP 4133]
[New LWP 4134]
[New LWP 4135]
[New LWP 4136]
[New LWP 4137]
[New LWP 4138]
[New LWP 4139]
[New LWP 4140]
[New LWP 4141]
[New LWP 4142]
[New LWP 4143]
[New LWP 4144]
[New LWP 4145]
[New LWP 4146]
[New LWP 4147]
[New LWP 4148]
[New LWP 4149]
[New LWP 4150]
[New LWP 4151]
[New LWP 4152]
[New LWP 4153]
[New LWP 4154]
[New LWP 4155]
[New LWP 4156]
[New LWP 4157]
[New LWP 4158]
[New LWP 4159]
[New LWP 4160]
[New LWP 4161]
[New LWP 4162]
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c4802038e
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c4802038e
0x00007f8ec558bd50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 4032 "kudu"   0x00007f8ec558bd50 in ?? ()
  2    LWP 4033 "kudu"   0x00007f8ec094f7a0 in ?? ()
  3    LWP 4034 "kudu"   0x00007f8ec5587fb9 in ?? ()
  4    LWP 4035 "kudu"   0x00007f8ec5587fb9 in ?? ()
  5    LWP 4036 "kudu"   0x00007f8ec5587fb9 in ?? ()
  6    LWP 4037 "kernel-watcher-" 0x00007f8ec5587fb9 in ?? ()
  7    LWP 4044 "ntp client-4044" 0x00007f8ec558b9e2 in ?? ()
  8    LWP 4045 "file cache-evic" 0x00007f8ec5587fb9 in ?? ()
  9    LWP 4046 "sq_acceptor" 0x00007f8ec097fcb9 in ?? ()
  10   LWP 4049 "rpc reactor-404" 0x00007f8ec098ca47 in ?? ()
  11   LWP 4050 "rpc reactor-405" 0x00007f8ec098ca47 in ?? ()
  12   LWP 4051 "rpc reactor-405" 0x00007f8ec098ca47 in ?? ()
  13   LWP 4052 "rpc reactor-405" 0x00007f8ec098ca47 in ?? ()
  14   LWP 4053 "MaintenanceMgr " 0x00007f8ec5587ad3 in ?? ()
  15   LWP 4054 "txn-status-mana" 0x00007f8ec5587fb9 in ?? ()
  16   LWP 4055 "collect_and_rem" 0x00007f8ec5587fb9 in ?? ()
  17   LWP 4056 "tc-session-exp-" 0x00007f8ec5587fb9 in ?? ()
  18   LWP 4057 "rpc worker-4057" 0x00007f8ec5587ad3 in ?? ()
  19   LWP 4058 "rpc worker-4058" 0x00007f8ec5587ad3 in ?? ()
  20   LWP 4059 "rpc worker-4059" 0x00007f8ec5587ad3 in ?? ()
  21   LWP 4060 "rpc worker-4060" 0x00007f8ec5587ad3 in ?? ()
  22   LWP 4061 "rpc worker-4061" 0x00007f8ec5587ad3 in ?? ()
  23   LWP 4062 "rpc worker-4062" 0x00007f8ec5587ad3 in ?? ()
  24   LWP 4063 "rpc worker-4063" 0x00007f8ec5587ad3 in ?? ()
  25   LWP 4064 "rpc worker-4064" 0x00007f8ec5587ad3 in ?? ()
  26   LWP 4065 "rpc worker-4065" 0x00007f8ec5587ad3 in ?? ()
  27   LWP 4066 "rpc worker-4066" 0x00007f8ec5587ad3 in ?? ()
  28   LWP 4067 "rpc worker-4067" 0x00007f8ec5587ad3 in ?? ()
  29   LWP 4068 "rpc worker-4068" 0x00007f8ec5587ad3 in ?? ()
  30   LWP 4069 "rpc worker-4069" 0x00007f8ec5587ad3 in ?? ()
  31   LWP 4070 "rpc worker-4070" 0x00007f8ec5587ad3 in ?? ()
  32   LWP 4071 "rpc worker-4071" 0x00007f8ec5587ad3 in ?? ()
  33   LWP 4072 "rpc worker-4072" 0x00007f8ec5587ad3 in ?? ()
  34   LWP 4073 "rpc worker-4073" 0x00007f8ec5587ad3 in ?? ()
  35   LWP 4074 "rpc worker-4074" 0x00007f8ec5587ad3 in ?? ()
  36   LWP 4075 "rpc worker-4075" 0x00007f8ec5587ad3 in ?? ()
  37   LWP 4076 "rpc worker-4076" 0x00007f8ec5587ad3 in ?? ()
  38   LWP 4077 "rpc worker-4077" 0x00007f8ec5587ad3 in ?? ()
  39   LWP 4078 "rpc worker-4078" 0x00007f8ec5587ad3 in ?? ()
  40   LWP 4079 "rpc worker-4079" 0x00007f8ec5587ad3 in ?? ()
  41   LWP 4080 "rpc worker-4080" 0x00007f8ec5587ad3 in ?? ()
  42   LWP 4081 "rpc worker-4081" 0x00007f8ec5587ad3 in ?? ()
  43   LWP 4082 "rpc worker-4082" 0x00007f8ec5587ad3 in ?? ()
  44   LWP 4083 "rpc worker-4083" 0x00007f8ec5587ad3 in ?? ()
  45   LWP 4084 "rpc worker-4084" 0x00007f8ec5587ad3 in ?? ()
  46   LWP 4085 "rpc worker-4085" 0x00007f8ec5587ad3 in ?? ()
  47   LWP 4086 "rpc worker-4086" 0x00007f8ec5587ad3 in ?? ()
  48   LWP 4087 "rpc worker-4087" 0x00007f8ec5587ad3 in ?? ()
  49   LWP 4088 "rpc worker-4088" 0x00007f8ec5587ad3 in ?? ()
  50   LWP 4089 "rpc worker-4089" 0x00007f8ec5587ad3 in ?? ()
  51   LWP 4090 "rpc worker-4090" 0x00007f8ec5587ad3 in ?? ()
  52   LWP 4091 "rpc worker-4091" 0x00007f8ec5587ad3 in ?? ()
  53   LWP 4092 "rpc worker-4092" 0x00007f8ec5587ad3 in ?? ()
  54   LWP 4093 "rpc worker-4093" 0x00007f8ec5587ad3 in ?? ()
  55   LWP 4094 "rpc worker-4094" 0x00007f8ec5587ad3 in ?? ()
  56   LWP 4095 "rpc worker-4095" 0x00007f8ec5587ad3 in ?? ()
  57   LWP 4096 "rpc worker-4096" 0x00007f8ec5587ad3 in ?? ()
  58   LWP 4097 "rpc worker-4097" 0x00007f8ec5587ad3 in ?? ()
  59   LWP 4098 "rpc worker-4098" 0x00007f8ec5587ad3 in ?? ()
  60   LWP 4099 "rpc worker-4099" 0x00007f8ec5587ad3 in ?? ()
  61   LWP 4100 "rpc worker-4100" 0x00007f8ec5587ad3 in ?? ()
  62   LWP 4101 "rpc worker-4101" 0x00007f8ec5587ad3 in ?? ()
  63   LWP 4102 "rpc worker-4102" 0x00007f8ec5587ad3 in ?? ()
  64   LWP 4103 "rpc worker-4103" 0x00007f8ec5587ad3 in ?? ()
  65   LWP 4104 "rpc worker-4104" 0x00007f8ec5587ad3 in ?? ()
  66   LWP 4105 "rpc worker-4105" 0x00007f8ec5587ad3 in ?? ()
  67   LWP 4106 "rpc worker-4106" 0x00007f8ec5587ad3 in ?? ()
  68   LWP 4107 "rpc worker-4107" 0x00007f8ec5587ad3 in ?? ()
  69   LWP 4108 "rpc worker-4108" 0x00007f8ec5587ad3 in ?? ()
  70   LWP 4109 "rpc worker-4109" 0x00007f8ec5587ad3 in ?? ()
  71   LWP 4110 "rpc worker-4110" 0x00007f8ec5587ad3 in ?? ()
  72   LWP 4111 "rpc worker-4111" 0x00007f8ec5587ad3 in ?? ()
  73   LWP 4112 "rpc worker-4112" 0x00007f8ec5587ad3 in ?? ()
  74   LWP 4113 "rpc worker-4113" 0x00007f8ec5587ad3 in ?? ()
  75   LWP 4114 "rpc worker-4114" 0x00007f8ec5587ad3 in ?? ()
  76   LWP 4115 "rpc worker-4115" 0x00007f8ec5587ad3 in ?? ()
  77   LWP 4116 "rpc worker-4116" 0x00007f8ec5587ad3 in ?? ()
  78   LWP 4117 "rpc worker-4117" 0x00007f8ec5587ad3 in ?? ()
  79   LWP 4118 "rpc worker-4118" 0x00007f8ec5587ad3 in ?? ()
  80   LWP 4119 "rpc worker-4119" 0x00007f8ec5587ad3 in ?? ()
  81   LWP 4120 "rpc worker-4120" 0x00007f8ec5587ad3 in ?? ()
  82   LWP 4121 "rpc worker-4121" 0x00007f8ec5587ad3 in ?? ()
  83   LWP 4122 "rpc worker-4122" 0x00007f8ec5587ad3 in ?? ()
  84   LWP 4123 "rpc worker-4123" 0x00007f8ec5587ad3 in ?? ()
  85   LWP 4124 "rpc worker-4124" 0x00007f8ec5587ad3 in ?? ()
  86   LWP 4125 "rpc worker-4125" 0x00007f8ec5587ad3 in ?? ()
  87   LWP 4126 "rpc worker-4126" 0x00007f8ec5587ad3 in ?? ()
  88   LWP 4127 "rpc worker-4127" 0x00007f8ec5587ad3 in ?? ()
  89   LWP 4128 "rpc worker-4128" 0x00007f8ec5587ad3 in ?? ()
  90   LWP 4129 "rpc worker-4129" 0x00007f8ec5587ad3 in ?? ()
  91   LWP 4130 "rpc worker-4130" 0x00007f8ec5587ad3 in ?? ()
  92   LWP 4131 "rpc worker-4131" 0x00007f8ec5587ad3 in ?? ()
  93   LWP 4132 "rpc worker-4132" 0x00007f8ec5587ad3 in ?? ()
  94   LWP 4133 "rpc worker-4133" 0x00007f8ec5587ad3 in ?? ()
  95   LWP 4134 "rpc worker-4134" 0x00007f8ec5587ad3 in ?? ()
  96   LWP 4135 "rpc worker-4135" 0x00007f8ec5587ad3 in ?? ()
  97   LWP 4136 "rpc worker-4136" 0x00007f8ec5587ad3 in ?? ()
  98   LWP 4137 "rpc worker-4137" 0x00007f8ec5587ad3 in ?? ()
  99   LWP 4138 "rpc worker-4138" 0x00007f8ec5587ad3 in ?? ()
  100  LWP 4139 "rpc worker-4139" 0x00007f8ec5587ad3 in ?? ()
  101  LWP 4140 "rpc worker-4140" 0x00007f8ec5587ad3 in ?? ()
  102  LWP 4141 "rpc worker-4141" 0x00007f8ec5587ad3 in ?? ()
  103  LWP 4142 "rpc worker-4142" 0x00007f8ec5587ad3 in ?? ()
  104  LWP 4143 "rpc worker-4143" 0x00007f8ec5587ad3 in ?? ()
  105  LWP 4144 "rpc worker-4144" 0x00007f8ec5587ad3 in ?? ()
  106  LWP 4145 "rpc worker-4145" 0x00007f8ec5587ad3 in ?? ()
  107  LWP 4146 "rpc worker-4146" 0x00007f8ec5587ad3 in ?? ()
  108  LWP 4147 "rpc worker-4147" 0x00007f8ec5587ad3 in ?? ()
  109  LWP 4148 "rpc worker-4148" 0x00007f8ec5587ad3 in ?? ()
  110  LWP 4149 "rpc worker-4149" 0x00007f8ec5587ad3 in ?? ()
  111  LWP 4150 "rpc worker-4150" 0x00007f8ec5587ad3 in ?? ()
  112  LWP 4151 "rpc worker-4151" 0x00007f8ec5587ad3 in ?? ()
  113  LWP 4152 "rpc worker-4152" 0x00007f8ec5587ad3 in ?? ()
  114  LWP 4153 "rpc worker-4153" 0x00007f8ec5587ad3 in ?? ()
  115  LWP 4154 "rpc worker-4154" 0x00007f8ec5587ad3 in ?? ()
  116  LWP 4155 "rpc worker-4155" 0x00007f8ec5587ad3 in ?? ()
  117  LWP 4156 "rpc worker-4156" 0x00007f8ec5587ad3 in ?? ()
  118  LWP 4157 "diag-logger-415" 0x00007f8ec5587fb9 in ?? ()
  119  LWP 4158 "result-tracker-" 0x00007f8ec5587fb9 in ?? ()
  120  LWP 4159 "excess-log-dele" 0x00007f8ec5587fb9 in ?? ()
  121  LWP 4160 "acceptor-4160" 0x00007f8ec098e0c7 in ?? ()
  122  LWP 4161 "heartbeat-4161" 0x00007f8ec5587fb9 in ?? ()
  123  LWP 4162 "maintenance_sch" 0x00007f8ec5587fb9 in ?? ()

Thread 123 (LWP 4162):
#0  0x00007f8ec5587fb9 in ?? ()
#1  0x00007f0100000000 in ?? ()
#2  0x0000000000000101 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b54000028f0 in ?? ()
#5  0x00007f8e797b96c0 in ?? ()
#6  0x0000000000000202 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 4161):
#0  0x00007f8ec5587fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 121 (LWP 4160):
#0  0x00007f8ec098e0c7 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 120 (LWP 4159):
#0  0x00007f8ec5587fb9 in ?? ()
#1  0x00007f8e7afbc940 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007ffcd87f77b0 in ?? ()
#5  0x00007f8e7afbc7b0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 4158):
#0  0x00007f8ec5587fb9 in ?? ()
#1  0x0000000085352f88 in ?? ()
#2  0x0000000000000040 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b3400001008 in ?? ()
#5  0x00007f8e7b7bd800 in ?? ()
#6  0x0000000000000080 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 118 (LWP 4157):
#0  0x00007f8ec5587fb9 in ?? ()
#1  0x00007f8ebea0e008 in ?? ()
#2  0x0000000000000040 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4000000c90 in ?? ()
#5  0x00007f8e7bfbe750 in ?? ()
#6  0x0000000000000080 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 4156):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 116 (LWP 4155):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 115 (LWP 4154):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 114 (LWP 4153):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 4152):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 4151):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 4150):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 4149):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 4148):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 4147):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 4146):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 4145):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 4144):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 4143):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 4142):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 4141):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 4140):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 4139):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 4138):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 4137):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 97 (LWP 4136):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 4135):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 4134):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 4133):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 4132):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 4131):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 4130):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 4129):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 4128):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 4127):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 4126):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 4125):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 4124):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 4123):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 4122):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 4121):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 4120):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 4119):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 4118):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 4117):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 4116):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000766 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00007b24001147c8 in ?? ()
#4  0x00007f8e911ba710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f8e911ba730 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 76 (LWP 4115):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000005 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240010ffcc in ?? ()
#4  0x00007f8e919bb710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f8e919bb730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x0000000000000000 in ?? ()

Thread 75 (LWP 4114):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x000000000000096f in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240010d7dc in ?? ()
#4  0x00007f8e921bc710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f8e921bc730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x0000000000000000 in ?? ()

Thread 74 (LWP 4113):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 73 (LWP 4112):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 72 (LWP 4111):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 71 (LWP 4110):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 70 (LWP 4109):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 69 (LWP 4108):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 68 (LWP 4107):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 67 (LWP 4106):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 66 (LWP 4105):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 65 (LWP 4104):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 64 (LWP 4103):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 63 (LWP 4102):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 62 (LWP 4101):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 61 (LWP 4100):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 60 (LWP 4099):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 59 (LWP 4098):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 58 (LWP 4097):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 57 (LWP 4096):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b24000b902c in ?? ()
#4  0x00007f8e9b5bc710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f8e9b5bc730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x007f0400000026c2 in ?? ()
#9  0x00007f8ec5587770 in ?? ()
#10 0x00007f8e9b5bc730 in ?? ()
#11 0x0002008300000dfe in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 56 (LWP 4095):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 55 (LWP 4094):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 54 (LWP 4093):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 53 (LWP 4092):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 52 (LWP 4091):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 51 (LWP 4090):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 50 (LWP 4089):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 49 (LWP 4088):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 48 (LWP 4087):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 47 (LWP 4086):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 46 (LWP 4085):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 45 (LWP 4084):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 44 (LWP 4083):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 43 (LWP 4082):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 42 (LWP 4081):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 41 (LWP 4080):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 40 (LWP 4079):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 39 (LWP 4078):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 38 (LWP 4077):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 37 (LWP 4076):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000003 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240005ffec in ?? ()
#4  0x00007f8ea59be710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f8ea59be730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000000000045e4c9 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007f8ec5587770 in ?? ()
#10 0x00007f8ea59be730 in ?? ()
#11 0x00007f8eb65b5c60 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 36 (LWP 4075):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 35 (LWP 4074):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 34 (LWP 4073):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 33 (LWP 4072):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 32 (LWP 4071):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 31 (LWP 4070):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 30 (LWP 4069):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 29 (LWP 4068):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 28 (LWP 4067):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 27 (LWP 4066):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 26 (LWP 4065):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 25 (LWP 4064):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 24 (LWP 4063):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 23 (LWP 4062):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 22 (LWP 4061):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 21 (LWP 4060):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 20 (LWP 4059):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 19 (LWP 4058):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 18 (LWP 4057):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 17 (LWP 4056):
#0  0x00007f8ec5587fb9 in ?? ()
#1  0x0000000017a335c0 in ?? ()
#2  0x0000000000000006 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4800003a00 in ?? ()
#5  0x00007f8eaff8e700 in ?? ()
#6  0x000000000000000c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 16 (LWP 4055):
#0  0x00007f8ec5587fb9 in ?? ()
#1  0x00007f8eb078f9a8 in ?? ()
#2  0x000000000000000c in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b44000372d8 in ?? ()
#5  0x00007f8eb078f840 in ?? ()
#6  0x0000000000000018 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 15 (LWP 4054):
#0  0x00007f8ec5587fb9 in ?? ()
#1  0x0000000000000018 in ?? ()
#2  0x0000000000000006 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b5800000118 in ?? ()
#5  0x00007f8eb0f90410 in ?? ()
#6  0x000000000000000c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 4053):
#0  0x00007f8ec5587ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 13 (LWP 4052):
#0  0x00007f8ec098ca47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 4051):
#0  0x00007f8ec098ca47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 11 (LWP 4050):
#0  0x00007f8ec098ca47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 10 (LWP 4049):
#0  0x00007f8ec098ca47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 9 (LWP 4046):
#0  0x00007f8ec097fcb9 in ?? ()
#1  0x00007f8eb97bcc10 in ?? ()
#2  0x00007b040000a050 in ?? ()
#3  0x00007f8eb97bdb80 in ?? ()
#4  0x00007f8eb97bcc10 in ?? ()
#5  0x00007b040000a050 in ?? ()
#6  0x00000000004888a3 in __sanitizer::internal_alloc_placeholder ()
#7  0x00007f8ebe402000 in ?? ()
#8  0x0100000000000001 in ?? ()
#9  0x00007f8eb97bdb80 in ?? ()
#10 0x00007f8eca364908 in ?? ()
#11 0x0000000000000000 in ?? ()

Thread 8 (LWP 4045):
#0  0x00007f8ec5587fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 7 (LWP 4044):
#0  0x00007f8ec558b9e2 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 6 (LWP 4037):
#0  0x00007f8ec5587fb9 in ?? ()
#1  0x00007f8eba7bea40 in ?? ()
#2  0x0000000000000148 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4400035b98 in ?? ()
#5  0x00007f8eba7be5d0 in ?? ()
#6  0x0000000000000290 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 5 (LWP 4036):
#0  0x00007f8ec5587fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 4 (LWP 4035):
#0  0x00007f8ec5587fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 3 (LWP 4034):
#0  0x00007f8ec5587fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 2 (LWP 4033):
#0  0x00007f8ec094f7a0 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 1 (LWP 4032):
#0  0x00007f8ec558bd50 in ?? ()
#1  0x0000600001000078 in ?? ()
#2  0x0000000000467b2b in __sanitizer::internal_alloc_placeholder ()
#3  0x00007f8ebfbadcc0 in ?? ()
#4  0x00007f8ebfbadcc0 in ?? ()
#5  0x00007ffcd87f75c0 in ?? ()
#6  0x000000000048aef4 in __sanitizer::internal_alloc_placeholder ()
#7  0x0000600001000078 in ?? ()
#8  0x0000e00000a94924 in ?? ()
#9  0x00007f8ebfbadcc0 in ?? ()
#10 0x00007f8ec3ab3f0b in ?? ()
#11 0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20250624 19:59:12.704610  3133 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 2 with UUID 9303e28334fc44629868721544e0dc96 and pid 4165
************************ BEGIN STACKS **************************
[New LWP 4166]
[New LWP 4167]
[New LWP 4168]
[New LWP 4169]
[New LWP 4171]
[New LWP 4178]
[New LWP 4179]
[New LWP 4180]
[New LWP 4183]
[New LWP 4184]
[New LWP 4185]
[New LWP 4186]
[New LWP 4187]
[New LWP 4188]
[New LWP 4189]
[New LWP 4190]
[New LWP 4191]
[New LWP 4192]
[New LWP 4193]
[New LWP 4194]
[New LWP 4195]
[New LWP 4196]
[New LWP 4197]
[New LWP 4198]
[New LWP 4199]
[New LWP 4200]
[New LWP 4201]
[New LWP 4202]
[New LWP 4203]
[New LWP 4204]
[New LWP 4205]
[New LWP 4206]
[New LWP 4207]
[New LWP 4208]
[New LWP 4209]
[New LWP 4210]
[New LWP 4211]
[New LWP 4212]
[New LWP 4213]
[New LWP 4214]
[New LWP 4215]
[New LWP 4216]
[New LWP 4217]
[New LWP 4218]
[New LWP 4219]
[New LWP 4220]
[New LWP 4221]
[New LWP 4222]
[New LWP 4223]
[New LWP 4224]
[New LWP 4225]
[New LWP 4226]
[New LWP 4227]
[New LWP 4228]
[New LWP 4229]
[New LWP 4230]
[New LWP 4231]
[New LWP 4232]
[New LWP 4233]
[New LWP 4234]
[New LWP 4235]
[New LWP 4236]
[New LWP 4237]
[New LWP 4238]
[New LWP 4239]
[New LWP 4240]
[New LWP 4241]
[New LWP 4242]
[New LWP 4243]
[New LWP 4244]
[New LWP 4245]
[New LWP 4246]
[New LWP 4247]
[New LWP 4248]
[New LWP 4249]
[New LWP 4250]
[New LWP 4251]
[New LWP 4252]
[New LWP 4253]
[New LWP 4254]
[New LWP 4255]
[New LWP 4256]
[New LWP 4257]
[New LWP 4258]
[New LWP 4259]
[New LWP 4260]
[New LWP 4261]
[New LWP 4262]
[New LWP 4263]
[New LWP 4264]
[New LWP 4265]
[New LWP 4266]
[New LWP 4267]
[New LWP 4268]
[New LWP 4269]
[New LWP 4270]
[New LWP 4271]
[New LWP 4272]
[New LWP 4273]
[New LWP 4274]
[New LWP 4275]
[New LWP 4276]
[New LWP 4277]
[New LWP 4278]
[New LWP 4279]
[New LWP 4280]
[New LWP 4281]
[New LWP 4282]
[New LWP 4283]
[New LWP 4284]
[New LWP 4285]
[New LWP 4286]
[New LWP 4287]
[New LWP 4288]
[New LWP 4289]
[New LWP 4290]
[New LWP 4291]
[New LWP 4292]
[New LWP 4293]
[New LWP 4294]
[New LWP 4295]
[New LWP 4296]
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c4802038e
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c48020396
Cannot access memory at address 0x4108070c4802038e
0x00007f07b4aaad50 in ?? ()
  Id   Target Id         Frame 
* 1    LWP 4165 "kudu"   0x00007f07b4aaad50 in ?? ()
  2    LWP 4166 "kudu"   0x00007f07afe6e7a0 in ?? ()
  3    LWP 4167 "kudu"   0x00007f07b4aa6fb9 in ?? ()
  4    LWP 4168 "kudu"   0x00007f07b4aa6fb9 in ?? ()
  5    LWP 4169 "kudu"   0x00007f07b4aa6fb9 in ?? ()
  6    LWP 4171 "kernel-watcher-" 0x00007f07b4aa6fb9 in ?? ()
  7    LWP 4178 "ntp client-4178" 0x00007f07b4aaa9e2 in ?? ()
  8    LWP 4179 "file cache-evic" 0x00007f07b4aa6fb9 in ?? ()
  9    LWP 4180 "sq_acceptor" 0x00007f07afe9ecb9 in ?? ()
  10   LWP 4183 "rpc reactor-418" 0x00007f07afeaba47 in ?? ()
  11   LWP 4184 "rpc reactor-418" 0x00007f07afeaba47 in ?? ()
  12   LWP 4185 "rpc reactor-418" 0x00007f07afeaba47 in ?? ()
  13   LWP 4186 "rpc reactor-418" 0x00007f07afeaba47 in ?? ()
  14   LWP 4187 "MaintenanceMgr " 0x00007f07b4aa6ad3 in ?? ()
  15   LWP 4188 "txn-status-mana" 0x00007f07b4aa6fb9 in ?? ()
  16   LWP 4189 "collect_and_rem" 0x00007f07b4aa6fb9 in ?? ()
  17   LWP 4190 "tc-session-exp-" 0x00007f07b4aa6fb9 in ?? ()
  18   LWP 4191 "rpc worker-4191" 0x00007f07b4aa6ad3 in ?? ()
  19   LWP 4192 "rpc worker-4192" 0x00007f07b4aa6ad3 in ?? ()
  20   LWP 4193 "rpc worker-4193" 0x00007f07b4aa6ad3 in ?? ()
  21   LWP 4194 "rpc worker-4194" 0x00007f07b4aa6ad3 in ?? ()
  22   LWP 4195 "rpc worker-4195" 0x00007f07b4aa6ad3 in ?? ()
  23   LWP 4196 "rpc worker-4196" 0x00007f07b4aa6ad3 in ?? ()
  24   LWP 4197 "rpc worker-4197" 0x00007f07b4aa6ad3 in ?? ()
  25   LWP 4198 "rpc worker-4198" 0x00007f07b4aa6ad3 in ?? ()
  26   LWP 4199 "rpc worker-4199" 0x00007f07b4aa6ad3 in ?? ()
  27   LWP 4200 "rpc worker-4200" 0x00007f07b4aa6ad3 in ?? ()
  28   LWP 4201 "rpc worker-4201" 0x00007f07b4aa6ad3 in ?? ()
  29   LWP 4202 "rpc worker-4202" 0x00007f07b4aa6ad3 in ?? ()
  30   LWP 4203 "rpc worker-4203" 0x00007f07b4aa6ad3 in ?? ()
  31   LWP 4204 "rpc worker-4204" 0x00007f07b4aa6ad3 in ?? ()
  32   LWP 4205 "rpc worker-4205" 0x00007f07b4aa6ad3 in ?? ()
  33   LWP 4206 "rpc worker-4206" 0x00007f07b4aa6ad3 in ?? ()
  34   LWP 4207 "rpc worker-4207" 0x00007f07b4aa6ad3 in ?? ()
  35   LWP 4208 "rpc worker-4208" 0x00007f07b4aa6ad3 in ?? ()
  36   LWP 4209 "rpc worker-4209" 0x00007f07b4aa6ad3 in ?? ()
  37   LWP 4210 "rpc worker-4210" 0x00007f07b4aa6ad3 in ?? ()
  38   LWP 4211 "rpc worker-4211" 0x00007f07b4aa6ad3 in ?? ()
  39   LWP 4212 "rpc worker-4212" 0x00007f07b4aa6ad3 in ?? ()
  40   LWP 4213 "rpc worker-4213" 0x00007f07b4aa6ad3 in ?? ()
  41   LWP 4214 "rpc worker-4214" 0x00007f07b4aa6ad3 in ?? ()
  42   LWP 4215 "rpc worker-4215" 0x00007f07b4aa6ad3 in ?? ()
  43   LWP 4216 "rpc worker-4216" 0x00007f07b4aa6ad3 in ?? ()
  44   LWP 4217 "rpc worker-4217" 0x00007f07b4aa6ad3 in ?? ()
  45   LWP 4218 "rpc worker-4218" 0x00007f07b4aa6ad3 in ?? ()
  46   LWP 4219 "rpc worker-4219" 0x00007f07b4aa6ad3 in ?? ()
  47   LWP 4220 "rpc worker-4220" 0x00007f07b4aa6ad3 in ?? ()
  48   LWP 4221 "rpc worker-4221" 0x00007f07b4aa6ad3 in ?? ()
  49   LWP 4222 "rpc worker-4222" 0x00007f07b4aa6ad3 in ?? ()
  50   LWP 4223 "rpc worker-4223" 0x00007f07b4aa6ad3 in ?? ()
  51   LWP 4224 "rpc worker-4224" 0x00007f07b4aa6ad3 in ?? ()
  52   LWP 4225 "rpc worker-4225" 0x00007f07b4aa6ad3 in ?? ()
  53   LWP 4226 "rpc worker-4226" 0x00007f07b4aa6ad3 in ?? ()
  54   LWP 4227 "rpc worker-4227" 0x00007f07b4aa6ad3 in ?? ()
  55   LWP 4228 "rpc worker-4228" 0x00007f07b4aa6ad3 in ?? ()
  56   LWP 4229 "rpc worker-4229" 0x00007f07b4aa6ad3 in ?? ()
  57   LWP 4230 "rpc worker-4230" 0x00007f07b4aa6ad3 in ?? ()
  58   LWP 4231 "rpc worker-4231" 0x00007f07b4aa6ad3 in ?? ()
  59   LWP 4232 "rpc worker-4232" 0x00007f07b4aa6ad3 in ?? ()
  60   LWP 4233 "rpc worker-4233" 0x00007f07b4aa6ad3 in ?? ()
  61   LWP 4234 "rpc worker-4234" 0x00007f07b4aa6ad3 in ?? ()
  62   LWP 4235 "rpc worker-4235" 0x00007f07b4aa6ad3 in ?? ()
  63   LWP 4236 "rpc worker-4236" 0x00007f07b4aa6ad3 in ?? ()
  64   LWP 4237 "rpc worker-4237" 0x00007f07b4aa6ad3 in ?? ()
  65   LWP 4238 "rpc worker-4238" 0x00007f07b4aa6ad3 in ?? ()
  66   LWP 4239 "rpc worker-4239" 0x00007f07b4aa6ad3 in ?? ()
  67   LWP 4240 "rpc worker-4240" 0x00007f07b4aa6ad3 in ?? ()
  68   LWP 4241 "rpc worker-4241" 0x00007f07b4aa6ad3 in ?? ()
  69   LWP 4242 "rpc worker-4242" 0x00007f07b4aa6ad3 in ?? ()
  70   LWP 4243 "rpc worker-4243" 0x00007f07b4aa6ad3 in ?? ()
  71   LWP 4244 "rpc worker-4244" 0x00007f07b4aa6ad3 in ?? ()
  72   LWP 4245 "rpc worker-4245" 0x00007f07b4aa6ad3 in ?? ()
  73   LWP 4246 "rpc worker-4246" 0x00007f07b4aa6ad3 in ?? ()
  74   LWP 4247 "rpc worker-4247" 0x00007f07b4aa6ad3 in ?? ()
  75   LWP 4248 "rpc worker-4248" 0x00007f07b4aa6ad3 in ?? ()
  76   LWP 4249 "rpc worker-4249" 0x00007f07b4aa6ad3 in ?? ()
  77   LWP 4250 "rpc worker-4250" 0x00007f07b4aa6ad3 in ?? ()
  78   LWP 4251 "rpc worker-4251" 0x00007f07b4aa6ad3 in ?? ()
  79   LWP 4252 "rpc worker-4252" 0x00007f07b4aa6ad3 in ?? ()
  80   LWP 4253 "rpc worker-4253" 0x00007f07b4aa6ad3 in ?? ()
  81   LWP 4254 "rpc worker-4254" 0x00007f07b4aa6ad3 in ?? ()
  82   LWP 4255 "rpc worker-4255" 0x00007f07b4aa6ad3 in ?? ()
  83   LWP 4256 "rpc worker-4256" 0x00007f07b4aa6ad3 in ?? ()
  84   LWP 4257 "rpc worker-4257" 0x00007f07b4aa6ad3 in ?? ()
  85   LWP 4258 "rpc worker-4258" 0x00007f07b4aa6ad3 in ?? ()
  86   LWP 4259 "rpc worker-4259" 0x00007f07b4aa6ad3 in ?? ()
  87   LWP 4260 "rpc worker-4260" 0x00007f07b4aa6ad3 in ?? ()
  88   LWP 4261 "rpc worker-4261" 0x00007f07b4aa6ad3 in ?? ()
  89   LWP 4262 "rpc worker-4262" 0x00007f07b4aa6ad3 in ?? ()
  90   LWP 4263 "rpc worker-4263" 0x00007f07b4aa6ad3 in ?? ()
  91   LWP 4264 "rpc worker-4264" 0x00007f07b4aa6ad3 in ?? ()
  92   LWP 4265 "rpc worker-4265" 0x00007f07b4aa6ad3 in ?? ()
  93   LWP 4266 "rpc worker-4266" 0x00007f07b4aa6ad3 in ?? ()
  94   LWP 4267 "rpc worker-4267" 0x00007f07b4aa6ad3 in ?? ()
  95   LWP 4268 "rpc worker-4268" 0x00007f07b4aa6ad3 in ?? ()
  96   LWP 4269 "rpc worker-4269" 0x00007f07b4aa6ad3 in ?? ()
  97   LWP 4270 "rpc worker-4270" 0x00007f07b4aa6ad3 in ?? ()
  98   LWP 4271 "rpc worker-4271" 0x00007f07b4aa6ad3 in ?? ()
  99   LWP 4272 "rpc worker-4272" 0x00007f07b4aa6ad3 in ?? ()
  100  LWP 4273 "rpc worker-4273" 0x00007f07b4aa6ad3 in ?? ()
  101  LWP 4274 "rpc worker-4274" 0x00007f07b4aa6ad3 in ?? ()
  102  LWP 4275 "rpc worker-4275" 0x00007f07b4aa6ad3 in ?? ()
  103  LWP 4276 "rpc worker-4276" 0x00007f07b4aa6ad3 in ?? ()
  104  LWP 4277 "rpc worker-4277" 0x00007f07b4aa6ad3 in ?? ()
  105  LWP 4278 "rpc worker-4278" 0x00007f07b4aa6ad3 in ?? ()
  106  LWP 4279 "rpc worker-4279" 0x00007f07b4aa6ad3 in ?? ()
  107  LWP 4280 "rpc worker-4280" 0x00007f07b4aa6ad3 in ?? ()
  108  LWP 4281 "rpc worker-4281" 0x00007f07b4aa6ad3 in ?? ()
  109  LWP 4282 "rpc worker-4282" 0x00007f07b4aa6ad3 in ?? ()
  110  LWP 4283 "rpc worker-4283" 0x00007f07b4aa6ad3 in ?? ()
  111  LWP 4284 "rpc worker-4284" 0x00007f07b4aa6ad3 in ?? ()
  112  LWP 4285 "rpc worker-4285" 0x00007f07b4aa6ad3 in ?? ()
  113  LWP 4286 "rpc worker-4286" 0x00007f07b4aa6ad3 in ?? ()
  114  LWP 4287 "rpc worker-4287" 0x00007f07b4aa6ad3 in ?? ()
  115  LWP 4288 "rpc worker-4288" 0x00007f07b4aa6ad3 in ?? ()
  116  LWP 4289 "rpc worker-4289" 0x00007f07b4aa6ad3 in ?? ()
  117  LWP 4290 "rpc worker-4290" 0x00007f07b4aa6ad3 in ?? ()
  118  LWP 4291 "diag-logger-429" 0x00007f07b4aa6fb9 in ?? ()
  119  LWP 4292 "result-tracker-" 0x00007f07b4aa6fb9 in ?? ()
  120  LWP 4293 "excess-log-dele" 0x00007f07b4aa6fb9 in ?? ()
  121  LWP 4294 "acceptor-4294" 0x00007f07afead0c7 in ?? ()
  122  LWP 4295 "heartbeat-4295" 0x00007f07b4aa6fb9 in ?? ()
  123  LWP 4296 "maintenance_sch" 0x00007f07b4aa6fb9 in ?? ()

Thread 123 (LWP 4296):
#0  0x00007f07b4aa6fb9 in ?? ()
#1  0x00007b0100000000 in ?? ()
#2  0x00000000000000fd in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b54000028f0 in ?? ()
#5  0x00007f0768cb96c0 in ?? ()
#6  0x00000000000001fa in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 122 (LWP 4295):
#0  0x00007f07b4aa6fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 121 (LWP 4294):
#0  0x00007f07afead0c7 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 120 (LWP 4293):
#0  0x00007f07b4aa6fb9 in ?? ()
#1  0x00007f076a4bc940 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007fff876ef870 in ?? ()
#5  0x00007f076a4bc7b0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 119 (LWP 4292):
#0  0x00007f07b4aa6fb9 in ?? ()
#1  0x0000000085352f88 in ?? ()
#2  0x000000000000003f in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b3400001008 in ?? ()
#5  0x00007f076acbd800 in ?? ()
#6  0x000000000000007e in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 118 (LWP 4291):
#0  0x00007f07b4aa6fb9 in ?? ()
#1  0x00007f07adf0e008 in ?? ()
#2  0x000000000000003b in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4000000c90 in ?? ()
#5  0x00007f076b4be750 in ?? ()
#6  0x0000000000000076 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 117 (LWP 4290):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 116 (LWP 4289):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 115 (LWP 4288):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 114 (LWP 4287):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 113 (LWP 4286):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 112 (LWP 4285):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 111 (LWP 4284):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 110 (LWP 4283):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 109 (LWP 4282):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 108 (LWP 4281):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 107 (LWP 4280):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 106 (LWP 4279):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 105 (LWP 4278):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 104 (LWP 4277):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 103 (LWP 4276):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 102 (LWP 4275):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 101 (LWP 4274):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 100 (LWP 4273):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 99 (LWP 4272):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 98 (LWP 4271):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 97 (LWP 4270):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 96 (LWP 4269):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 95 (LWP 4268):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 94 (LWP 4267):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 93 (LWP 4266):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 92 (LWP 4265):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 91 (LWP 4264):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 90 (LWP 4263):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 89 (LWP 4262):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 88 (LWP 4261):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 87 (LWP 4260):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 86 (LWP 4259):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 85 (LWP 4258):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 84 (LWP 4257):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 83 (LWP 4256):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 82 (LWP 4255):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 81 (LWP 4254):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 80 (LWP 4253):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 79 (LWP 4252):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 78 (LWP 4251):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 77 (LWP 4250):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x00000000000008be in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00007b24001147c8 in ?? ()
#4  0x00007f07806ba710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f07806ba730 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 76 (LWP 4249):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x00000000000007bd in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240010ffcc in ?? ()
#4  0x00007f0780ebb710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f0780ebb730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000000000045e4c9 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007f07b4aa6770 in ?? ()
#10 0x00007f0780ebb730 in ?? ()
#11 0x00007f0764bd5278 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 75 (LWP 4248):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240010d7dc in ?? ()
#4  0x00007f07816bc710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f07816bc730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x0000000000000000 in ?? ()

Thread 74 (LWP 4247):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

[Threads 73-58 (LWPs 4246-4231): identical unresolved stack to Thread 74, elided]

Thread 57 (LWP 4230):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b24000b902c in ?? ()
#4  0x00007f078aabc710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f078aabc730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x007f0400000026c2 in ?? ()
#9  0x00007f07b4aa6770 in ?? ()
#10 0x00007f078aabc730 in ?? ()
#11 0x0002008300000dfe in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 56 (LWP 4229):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

[Threads 55-38 (LWPs 4228-4211): identical unresolved stack to Thread 56, elided]

Thread 37 (LWP 4210):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000002 in ?? ()
#2  0x0000000000000081 in ?? ()
#3  0x00007b240005ffe8 in ?? ()
#4  0x00007f0794ebe710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f0794ebe730 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 36 (LWP 4209):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000001 in ?? ()
#2  0x0000000100000081 in ?? ()
#3  0x00007b240005d7fc in ?? ()
#4  0x00007f07958b6710 in ?? ()
#5  0x0000008000000000 in ?? ()
#6  0x00007f07958b6730 in ?? ()
#7  0x0000000000000001 in ?? ()
#8  0x000000000045e4c9 in __sanitizer::internal_alloc_placeholder ()
#9  0x00007f07b4aa6770 in ?? ()
#10 0x00007f07958b6730 in ?? ()
#11 0x00007f07ace97c48 in ?? ()
#12 0x0000000000000000 in ?? ()

Thread 35 (LWP 4208):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

[Threads 34-18 (LWPs 4206-4191): identical unresolved stack to Thread 35, elided]

Thread 17 (LWP 4190):
#0  0x00007f07b4aa6fb9 in ?? ()
#1  0x0000000017a335c0 in ?? ()
#2  0x0000000000000006 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4800003a00 in ?? ()
#5  0x00007f079f48e700 in ?? ()
#6  0x000000000000000c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 16 (LWP 4189):
#0  0x00007f07b4aa6fb9 in ?? ()
#1  0x00007f079fc8f9a8 in ?? ()
#2  0x000000000000000c in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b44000372d8 in ?? ()
#5  0x00007f079fc8f840 in ?? ()
#6  0x0000000000000018 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 15 (LWP 4188):
#0  0x00007f07b4aa6fb9 in ?? ()
#1  0x0000000000000018 in ?? ()
#2  0x0000000000000006 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b5800000118 in ?? ()
#5  0x00007f07a0490410 in ?? ()
#6  0x000000000000000c in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 14 (LWP 4187):
#0  0x00007f07b4aa6ad3 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 13 (LWP 4186):
#0  0x00007f07afeaba47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 12 (LWP 4185):
#0  0x00007f07afeaba47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 11 (LWP 4184):
#0  0x00007f07afeaba47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 10 (LWP 4183):
#0  0x00007f07afeaba47 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 9 (LWP 4180):
#0  0x00007f07afe9ecb9 in ?? ()
#1  0x00007f07a8cbcc10 in ?? ()
#2  0x00007b040000a850 in ?? ()
#3  0x00007f07a8cbdb80 in ?? ()
#4  0x00007f07a8cbcc10 in ?? ()
#5  0x00007b040000a850 in ?? ()
#6  0x00000000004888a3 in __sanitizer::internal_alloc_placeholder ()
#7  0x00007f07ad910000 in ?? ()
#8  0x0100000000000001 in ?? ()
#9  0x00007f07a8cbdb80 in ?? ()
#10 0x00007f07b9883908 in ?? ()
#11 0x0000000000000000 in ?? ()

Thread 8 (LWP 4179):
#0  0x00007f07b4aa6fb9 in ?? ()
#1  0x0000600000000000 in ?? ()
#2  0x0000000000000001 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4400034018 in ?? ()
#5  0x00007f07a84bb7f0 in ?? ()
#6  0x0000000000000002 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 7 (LWP 4178):
#0  0x00007f07b4aaa9e2 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 6 (LWP 4171):
#0  0x00007f07b4aa6fb9 in ?? ()
#1  0x00007f07a9cbea40 in ?? ()
#2  0x0000000000000144 in ?? ()
#3  0x0000000000000081 in ?? ()
#4  0x00007b4400035b98 in ?? ()
#5  0x00007f07a9cbe5d0 in ?? ()
#6  0x0000000000000288 in ?? ()
#7  0x0000000000000000 in ?? ()

Thread 5 (LWP 4169):
#0  0x00007f07b4aa6fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 4 (LWP 4168):
#0  0x00007f07b4aa6fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 3 (LWP 4167):
#0  0x00007f07b4aa6fb9 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 2 (LWP 4166):
#0  0x00007f07afe6e7a0 in ?? ()
#1  0x0000000000000000 in ?? ()

Thread 1 (LWP 4165):
#0  0x00007f07b4aaad50 in ?? ()
#1  0x0000600001000078 in ?? ()
#2  0x0000000000467b2b in __sanitizer::internal_alloc_placeholder ()
#3  0x00007f07af0cccc0 in ?? ()
#4  0x00007f07af0cccc0 in ?? ()
#5  0x00007fff876ef680 in ?? ()
#6  0x000000000048aef4 in __sanitizer::internal_alloc_placeholder ()
#7  0x0000600001000078 in ?? ()
#8  0x0000e00000a9a15b in ?? ()
#9  0x00007f07af0cccc0 in ?? ()
#10 0x00007f07b2fd2f0b in ?? ()
#11 0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20250624 19:59:13.687809  3133 external_mini_cluster.cc:1620] Killing /tmp/dist-test-task37W5hK/build/tsan/bin/kudu with pid 3899
I20250624 19:59:13.738415  3133 external_mini_cluster.cc:1620] Killing /tmp/dist-test-task37W5hK/build/tsan/bin/kudu with pid 4032
I20250624 19:59:13.788393  3133 external_mini_cluster.cc:1620] Killing /tmp/dist-test-task37W5hK/build/tsan/bin/kudu with pid 4165
I20250624 19:59:13.841274  3133 external_mini_cluster.cc:1620] Killing /tmp/dist-test-task37W5hK/build/tsan/bin/kudu with pid 3807
2025-06-24T19:59:13Z chronyd exiting
I20250624 19:59:13.898190  3133 test_util.cc:183] -----------------------------------------------
I20250624 19:59:13.898419  3133 test_util.cc:184] Had failures, leaving test files at /tmp/dist-test-task37W5hK/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750795058047324-3133-0
[  FAILED  ] TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate (71853 ms)
[----------] 4 tests from TabletCopyITest (95684 ms total)

[----------] 1 test from FaultFlags/BadTabletCopyITest
[ RUN      ] FaultFlags/BadTabletCopyITest.TestBadCopy/1
/home/jenkins-slave/workspace/build_and_test_flaky@2/src/kudu/integration-tests/tablet_copy-itest.cc:1510: Skipped
test is skipped; set KUDU_ALLOW_SLOW_TESTS=1 to run
[  SKIPPED ] FaultFlags/BadTabletCopyITest.TestBadCopy/1 (8 ms)
[----------] 1 test from FaultFlags/BadTabletCopyITest (8 ms total)

[----------] Global test environment tear-down
[==========] 5 tests from 2 test suites ran. (95694 ms total)
[  PASSED  ] 1 test.
[  SKIPPED ] 3 tests, listed below:
[  SKIPPED ] TabletCopyITest.TestRejectRogueLeader
[  SKIPPED ] TabletCopyITest.TestDeleteLeaderDuringTabletCopyStressTest
[  SKIPPED ] FaultFlags/BadTabletCopyITest.TestBadCopy/1
[  FAILED  ] 1 test, listed below:
[  FAILED  ] TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate

 1 FAILED TEST