Note: This is test shard 1 of 6.
[==========] Running 5 tests from 2 test suites.
[----------] Global test environment set-up.
[----------] 4 tests from TabletCopyITest
[ RUN      ] TabletCopyITest.TestRejectRogueLeader
/home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/integration-tests/tablet_copy-itest.cc:172: Skipped
test is skipped; set KUDU_ALLOW_SLOW_TESTS=1 to run
[  SKIPPED ] TabletCopyITest.TestRejectRogueLeader (10 ms)
[ RUN      ] TabletCopyITest.TestDeleteLeaderDuringTabletCopyStressTest
/home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/integration-tests/tablet_copy-itest.cc:727: Skipped
test is skipped; set KUDU_ALLOW_SLOW_TESTS=1 to run
[  SKIPPED ] TabletCopyITest.TestDeleteLeaderDuringTabletCopyStressTest (7 ms)
[ RUN      ] TabletCopyITest.TestTabletCopyThrottling
2025-06-24T14:17:56Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2025-06-24T14:17:56Z Disabled control of system clock
WARNING: Logging before InitGoogleLogging() is written to STDERR
I20250624 14:17:56.707914 30585 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskTXj651/build/tsan/bin/kudu
/tmp/dist-test-taskTXj651/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.29.222.126:39907
--webserver_interface=127.29.222.126
--webserver_port=0
--builtin_ntp_servers=127.29.222.84:42211
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.29.222.126:39907
--master_tombstone_evicted_tablet_replicas=false with env {}
W20250624 14:17:56.999058 30594 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 14:17:56.999673 30594 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 14:17:57.000130 30594 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 14:17:57.031000 30594 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20250624 14:17:57.031313 30594 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 14:17:57.031594 30594 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20250624 14:17:57.031836 30594 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20250624 14:17:57.067059 30594 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.29.222.84:42211
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/wal
--master_tombstone_evicted_tablet_replicas=false
--ipki_ca_key_size=768
--master_addresses=127.29.222.126:39907
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.29.222.126:39907
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.29.222.126
--webserver_port=0
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true
Master server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 14:14:57 UTC on 24a791456cd2
build id 6749
TSAN enabled
I20250624 14:17:57.068432 30594 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 14:17:57.070101 30594 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 14:17:57.084499 30600 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 14:17:57.084818 30601 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 14:17:57.086352 30603 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 14:17:57.087760 30594 server_base.cc:1048] running on GCE node
I20250624 14:17:58.265031 30594 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 14:17:58.268196 30594 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 14:17:58.269657 30594 hybrid_clock.cc:648] HybridClock initialized: now 1750774678269599 us; error 82 us; skew 500 ppm
I20250624 14:17:58.270468 30594 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 14:17:58.277779 30594 webserver.cc:469] Webserver started at http://127.29.222.126:37251/ using document root <none> and password file <none>
I20250624 14:17:58.278731 30594 fs_manager.cc:362] Metadata directory not provided
I20250624 14:17:58.278985 30594 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 14:17:58.279439 30594 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 14:17:58.283980 30594 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/data/instance:
uuid: "d1211fbc127f4094b4f20ef60c0ef5f2"
format_stamp: "Formatted at 2025-06-24 14:17:58 on dist-test-slave-6xgw"
I20250624 14:17:58.285034 30594 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/wal/instance:
uuid: "d1211fbc127f4094b4f20ef60c0ef5f2"
format_stamp: "Formatted at 2025-06-24 14:17:58 on dist-test-slave-6xgw"
I20250624 14:17:58.292405 30594 fs_manager.cc:696] Time spent creating directory manager: real 0.007s user 0.001s sys 0.008s
I20250624 14:17:58.297931 30610 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 14:17:58.299044 30594 fs_manager.cc:730] Time spent opening block manager: real 0.003s user 0.003s sys 0.000s
I20250624 14:17:58.299340 30594 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/data,/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/wal
uuid: "d1211fbc127f4094b4f20ef60c0ef5f2"
format_stamp: "Formatted at 2025-06-24 14:17:58 on dist-test-slave-6xgw"
I20250624 14:17:58.299666 30594 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 14:17:58.348968 30594 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 14:17:58.350409 30594 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 14:17:58.350812 30594 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 14:17:58.420542 30594 rpc_server.cc:307] RPC server started. Bound to: 127.29.222.126:39907
I20250624 14:17:58.420614 30661 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.29.222.126:39907 every 8 connection(s)
I20250624 14:17:58.423216 30594 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/data/info.pb
I20250624 14:17:58.427189 30585 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskTXj651/build/tsan/bin/kudu as pid 30594
I20250624 14:17:58.427706 30585 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/wal/instance
I20250624 14:17:58.429108 30662 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 14:17:58.448583 30662 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2: Bootstrap starting.
I20250624 14:17:58.455118 30662 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2: Neither blocks nor log segments found. Creating new log.
I20250624 14:17:58.457382 30662 log.cc:826] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2: Log is configured to *not* fsync() on all Append() calls
I20250624 14:17:58.462419 30662 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2: No bootstrap required, opened a new log
I20250624 14:17:58.480257 30662 raft_consensus.cc:357] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "d1211fbc127f4094b4f20ef60c0ef5f2" member_type: VOTER last_known_addr { host: "127.29.222.126" port: 39907 } }
I20250624 14:17:58.481041 30662 raft_consensus.cc:383] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 14:17:58.481297 30662 raft_consensus.cc:738] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: d1211fbc127f4094b4f20ef60c0ef5f2, State: Initialized, Role: FOLLOWER
I20250624 14:17:58.482038 30662 consensus_queue.cc:260] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "d1211fbc127f4094b4f20ef60c0ef5f2" member_type: VOTER last_known_addr { host: "127.29.222.126" port: 39907 } }
I20250624 14:17:58.482533 30662 raft_consensus.cc:397] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 14:17:58.482923 30662 raft_consensus.cc:491] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 14:17:58.483311 30662 raft_consensus.cc:3058] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [term 0 FOLLOWER]: Advancing to term 1
I20250624 14:17:58.487339 30662 raft_consensus.cc:513] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "d1211fbc127f4094b4f20ef60c0ef5f2" member_type: VOTER last_known_addr { host: "127.29.222.126" port: 39907 } }
I20250624 14:17:58.488109 30662 leader_election.cc:304] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: d1211fbc127f4094b4f20ef60c0ef5f2; no voters:
I20250624 14:17:58.489835 30662 leader_election.cc:290] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [CANDIDATE]: Term 1 election: Requested vote from peers
I20250624 14:17:58.490518 30667 raft_consensus.cc:2802] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [term 1 FOLLOWER]: Leader election won for term 1
I20250624 14:17:58.492713 30667 raft_consensus.cc:695] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [term 1 LEADER]: Becoming Leader. State: Replica: d1211fbc127f4094b4f20ef60c0ef5f2, State: Running, Role: LEADER
I20250624 14:17:58.493412 30667 consensus_queue.cc:237] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "d1211fbc127f4094b4f20ef60c0ef5f2" member_type: VOTER last_known_addr { host: "127.29.222.126" port: 39907 } }
I20250624 14:17:58.494867 30662 sys_catalog.cc:564] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [sys.catalog]: configured and running, proceeding with master startup.
I20250624 14:17:58.504043 30669 sys_catalog.cc:455] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [sys.catalog]: SysCatalogTable state changed. Reason: New leader d1211fbc127f4094b4f20ef60c0ef5f2. Latest consensus state: current_term: 1 leader_uuid: "d1211fbc127f4094b4f20ef60c0ef5f2" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "d1211fbc127f4094b4f20ef60c0ef5f2" member_type: VOTER last_known_addr { host: "127.29.222.126" port: 39907 } } }
I20250624 14:17:58.504904 30669 sys_catalog.cc:458] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [sys.catalog]: This master's current role is: LEADER
I20250624 14:17:58.507174 30668 sys_catalog.cc:455] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "d1211fbc127f4094b4f20ef60c0ef5f2" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "d1211fbc127f4094b4f20ef60c0ef5f2" member_type: VOTER last_known_addr { host: "127.29.222.126" port: 39907 } } }
I20250624 14:17:58.507930 30668 sys_catalog.cc:458] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [sys.catalog]: This master's current role is: LEADER
I20250624 14:17:58.509855 30676 catalog_manager.cc:1477] Loading table and tablet metadata into memory...
I20250624 14:17:58.522048 30676 catalog_manager.cc:1486] Initializing Kudu cluster ID...
I20250624 14:17:58.538581 30676 catalog_manager.cc:1349] Generated new cluster ID: c7ebdd58ee8743d3aa26a8b0be53ecb1
I20250624 14:17:58.538995 30676 catalog_manager.cc:1497] Initializing Kudu internal certificate authority...
I20250624 14:17:58.555032 30676 catalog_manager.cc:1372] Generated new certificate authority record
I20250624 14:17:58.556959 30676 catalog_manager.cc:1506] Loading token signing keys...
I20250624 14:17:58.582607 30676 catalog_manager.cc:5955] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2: Generated new TSK 0
I20250624 14:17:58.583753 30676 catalog_manager.cc:1516] Initializing in-progress tserver states...
I20250624 14:17:58.604984 30585 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskTXj651/build/tsan/bin/kudu
/tmp/dist-test-taskTXj651/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.29.222.65:0
--local_ip_for_outbound_sockets=127.29.222.65
--webserver_interface=127.29.222.65
--webserver_port=0
--tserver_master_addrs=127.29.222.126:39907
--builtin_ntp_servers=127.29.222.84:42211
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--num_tablets_to_copy_simultaneously=1 with env {}
W20250624 14:17:58.898495 30686 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 14:17:58.899019 30686 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 14:17:58.899552 30686 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 14:17:58.931649 30686 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 14:17:58.932648 30686 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.29.222.65
I20250624 14:17:58.967746 30686 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.29.222.84:42211
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.29.222.65:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.29.222.65
--webserver_port=0
--tserver_master_addrs=127.29.222.126:39907
--num_tablets_to_copy_simultaneously=1
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.29.222.65
--log_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 14:14:57 UTC on 24a791456cd2
build id 6749
TSAN enabled
I20250624 14:17:58.969414 30686 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 14:17:58.971082 30686 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 14:17:58.988719 30693 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 14:17:58.990303 30695 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 14:17:58.991066 30686 server_base.cc:1048] running on GCE node
W20250624 14:17:58.990653 30692 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 14:18:00.169611 30686 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 14:18:00.172683 30686 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 14:18:00.174228 30686 hybrid_clock.cc:648] HybridClock initialized: now 1750774680174179 us; error 65 us; skew 500 ppm
I20250624 14:18:00.175132 30686 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 14:18:00.186475 30686 webserver.cc:469] Webserver started at http://127.29.222.65:45439/ using document root <none> and password file <none>
I20250624 14:18:00.187494 30686 fs_manager.cc:362] Metadata directory not provided
I20250624 14:18:00.187758 30686 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 14:18:00.188211 30686 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 14:18:00.193461 30686 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-0/data/instance:
uuid: "4463b73640674948b4b479a5108fcc5b"
format_stamp: "Formatted at 2025-06-24 14:18:00 on dist-test-slave-6xgw"
I20250624 14:18:00.194614 30686 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-0/wal/instance:
uuid: "4463b73640674948b4b479a5108fcc5b"
format_stamp: "Formatted at 2025-06-24 14:18:00 on dist-test-slave-6xgw"
I20250624 14:18:00.202237 30686 fs_manager.cc:696] Time spent creating directory manager: real 0.007s user 0.006s sys 0.000s
I20250624 14:18:00.208639 30702 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 14:18:00.209836 30686 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.003s sys 0.002s
I20250624 14:18:00.210216 30686 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-0/data,/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-0/wal
uuid: "4463b73640674948b4b479a5108fcc5b"
format_stamp: "Formatted at 2025-06-24 14:18:00 on dist-test-slave-6xgw"
I20250624 14:18:00.210548 30686 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 14:18:00.262698 30686 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 14:18:00.264303 30686 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 14:18:00.264770 30686 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 14:18:00.267377 30686 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 14:18:00.271603 30686 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 14:18:00.271840 30686 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250624 14:18:00.272094 30686 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 14:18:00.272258 30686 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250624 14:18:00.431888 30686 rpc_server.cc:307] RPC server started. Bound to: 127.29.222.65:40743
I20250624 14:18:00.431957 30814 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.29.222.65:40743 every 8 connection(s)
I20250624 14:18:00.435663 30686 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-0/data/info.pb
I20250624 14:18:00.446019 30585 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskTXj651/build/tsan/bin/kudu as pid 30686
I20250624 14:18:00.446377 30585 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-0/wal/instance
I20250624 14:18:00.452709 30585 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskTXj651/build/tsan/bin/kudu
/tmp/dist-test-taskTXj651/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.29.222.66:0
--local_ip_for_outbound_sockets=127.29.222.66
--webserver_interface=127.29.222.66
--webserver_port=0
--tserver_master_addrs=127.29.222.126:39907
--builtin_ntp_servers=127.29.222.84:42211
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--num_tablets_to_copy_simultaneously=1 with env {}
I20250624 14:18:00.459627 30815 heartbeater.cc:344] Connected to a master server at 127.29.222.126:39907
I20250624 14:18:00.460053 30815 heartbeater.cc:461] Registering TS with master...
I20250624 14:18:00.461045 30815 heartbeater.cc:507] Master 127.29.222.126:39907 requested a full tablet report, sending...
I20250624 14:18:00.464114 30627 ts_manager.cc:194] Registered new tserver with Master: 4463b73640674948b4b479a5108fcc5b (127.29.222.65:40743)
I20250624 14:18:00.466061 30627 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.29.222.65:47903
W20250624 14:18:00.772691 30819 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 14:18:00.773224 30819 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 14:18:00.773772 30819 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 14:18:00.806447 30819 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 14:18:00.807379 30819 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.29.222.66
I20250624 14:18:00.842932 30819 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.29.222.84:42211
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.29.222.66:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.29.222.66
--webserver_port=0
--tserver_master_addrs=127.29.222.126:39907
--num_tablets_to_copy_simultaneously=1
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.29.222.66
--log_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 14:14:57 UTC on 24a791456cd2
build id 6749
TSAN enabled
I20250624 14:18:00.844241 30819 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 14:18:00.845924 30819 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 14:18:00.865622 30825 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 14:18:01.469588 30815 heartbeater.cc:499] Master 127.29.222.126:39907 was elected leader, sending a full tablet report...
W20250624 14:18:00.866833 30826 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 14:18:00.867472 30828 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 14:18:00.868165 30819 server_base.cc:1048] running on GCE node
I20250624 14:18:02.053455 30819 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 14:18:02.056370 30819 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 14:18:02.057760 30819 hybrid_clock.cc:648] HybridClock initialized: now 1750774682057693 us; error 78 us; skew 500 ppm
I20250624 14:18:02.058578 30819 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 14:18:02.065918 30819 webserver.cc:469] Webserver started at http://127.29.222.66:39137/ using document root <none> and password file <none>
I20250624 14:18:02.066891 30819 fs_manager.cc:362] Metadata directory not provided
I20250624 14:18:02.067113 30819 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 14:18:02.067584 30819 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 14:18:02.072129 30819 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/data/instance:
uuid: "974c4ecde5794a46989690774ea88ac5"
format_stamp: "Formatted at 2025-06-24 14:18:02 on dist-test-slave-6xgw"
I20250624 14:18:02.073249 30819 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/wal/instance:
uuid: "974c4ecde5794a46989690774ea88ac5"
format_stamp: "Formatted at 2025-06-24 14:18:02 on dist-test-slave-6xgw"
I20250624 14:18:02.080435 30819 fs_manager.cc:696] Time spent creating directory manager: real 0.007s user 0.008s sys 0.000s
I20250624 14:18:02.086652 30835 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 14:18:02.087707 30819 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.002s sys 0.001s
I20250624 14:18:02.088032 30819 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/data,/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/wal
uuid: "974c4ecde5794a46989690774ea88ac5"
format_stamp: "Formatted at 2025-06-24 14:18:02 on dist-test-slave-6xgw"
I20250624 14:18:02.088337 30819 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 14:18:02.155835 30819 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 14:18:02.157312 30819 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 14:18:02.157847 30819 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 14:18:02.160900 30819 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 14:18:02.165089 30819 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 14:18:02.165305 30819 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250624 14:18:02.165598 30819 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 14:18:02.165752 30819 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250624 14:18:02.330981 30819 rpc_server.cc:307] RPC server started. Bound to: 127.29.222.66:35417
I20250624 14:18:02.331087 30947 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.29.222.66:35417 every 8 connection(s)
I20250624 14:18:02.333482 30819 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/data/info.pb
I20250624 14:18:02.336503 30585 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskTXj651/build/tsan/bin/kudu as pid 30819
I20250624 14:18:02.337036 30585 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/wal/instance
I20250624 14:18:02.356930 30948 heartbeater.cc:344] Connected to a master server at 127.29.222.126:39907
I20250624 14:18:02.357448 30948 heartbeater.cc:461] Registering TS with master...
I20250624 14:18:02.358713 30948 heartbeater.cc:507] Master 127.29.222.126:39907 requested a full tablet report, sending...
I20250624 14:18:02.360832 30626 ts_manager.cc:194] Registered new tserver with Master: 974c4ecde5794a46989690774ea88ac5 (127.29.222.66:35417)
I20250624 14:18:02.362251 30626 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.29.222.66:48777
I20250624 14:18:02.372709 30585 external_mini_cluster.cc:934] 2 TS(s) registered with all masters
I20250624 14:18:02.407828 30585 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskTXj651/build/tsan/bin/kudu with pid 30819
I20250624 14:18:02.431636 30585 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskTXj651/build/tsan/bin/kudu with pid 30594
I20250624 14:18:02.464939 30585 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskTXj651/build/tsan/bin/kudu
/tmp/dist-test-taskTXj651/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.29.222.126:39907
--webserver_interface=127.29.222.126
--webserver_port=37251
--builtin_ntp_servers=127.29.222.84:42211
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.29.222.126:39907
--master_tombstone_evicted_tablet_replicas=false with env {}
W20250624 14:18:02.478940 30815 heartbeater.cc:646] Failed to heartbeat to 127.29.222.126:39907 (0 consecutive failures): Network error: Failed to send heartbeat to master: Client connection negotiation failed: client connection to 127.29.222.126:39907: connect: Connection refused (error 111)
W20250624 14:18:03.991334 30960 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 14:18:03.991935 30960 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 14:18:03.992337 30960 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 14:18:04.026206 30960 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20250624 14:18:04.026494 30960 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 14:18:04.026705 30960 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20250624 14:18:04.026963 30960 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20250624 14:18:04.061841 30960 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.29.222.84:42211
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/wal
--master_tombstone_evicted_tablet_replicas=false
--ipki_ca_key_size=768
--master_addresses=127.29.222.126:39907
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.29.222.126:39907
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.29.222.126
--webserver_port=37251
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true
Master server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 14:14:57 UTC on 24a791456cd2
build id 6749
TSAN enabled
I20250624 14:18:04.063126 30960 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 14:18:04.064704 30960 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 14:18:04.080277 30967 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 14:18:04.080279 30969 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 14:18:04.082664 30960 server_base.cc:1048] running on GCE node
W20250624 14:18:04.081552 30966 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 14:18:05.269073 30960 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 14:18:05.272274 30960 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 14:18:05.273725 30960 hybrid_clock.cc:648] HybridClock initialized: now 1750774685273694 us; error 50 us; skew 500 ppm
I20250624 14:18:05.274567 30960 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 14:18:05.286248 30960 webserver.cc:469] Webserver started at http://127.29.222.126:37251/ using document root <none> and password file <none>
I20250624 14:18:05.287292 30960 fs_manager.cc:362] Metadata directory not provided
I20250624 14:18:05.287541 30960 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 14:18:05.295439 30960 fs_manager.cc:714] Time spent opening directory manager: real 0.005s user 0.007s sys 0.001s
I20250624 14:18:05.300031 30977 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 14:18:05.301028 30960 fs_manager.cc:730] Time spent opening block manager: real 0.003s user 0.004s sys 0.001s
I20250624 14:18:05.301350 30960 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/data,/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/wal
uuid: "d1211fbc127f4094b4f20ef60c0ef5f2"
format_stamp: "Formatted at 2025-06-24 14:17:58 on dist-test-slave-6xgw"
I20250624 14:18:05.303268 30960 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 14:18:05.357159 30960 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 14:18:05.358646 30960 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 14:18:05.359113 30960 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 14:18:05.429958 30960 rpc_server.cc:307] RPC server started. Bound to: 127.29.222.126:39907
I20250624 14:18:05.430043 31028 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.29.222.126:39907 every 8 connection(s)
I20250624 14:18:05.432901 30960 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/master-0/data/info.pb
I20250624 14:18:05.440804 30585 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskTXj651/build/tsan/bin/kudu as pid 30960
I20250624 14:18:05.445065 31029 sys_catalog.cc:263] Verifying existing consensus state
I20250624 14:18:05.450104 31029 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2: Bootstrap starting.
I20250624 14:18:05.488257 31029 log.cc:826] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2: Log is configured to *not* fsync() on all Append() calls
I20250624 14:18:05.498934 30815 heartbeater.cc:344] Connected to a master server at 127.29.222.126:39907
I20250624 14:18:05.506450 31029 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2: Bootstrap replayed 1/1 log segments. Stats: ops{read=4 overwritten=0 applied=4 ignored=0} inserts{seen=3 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 14:18:05.507315 31029 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2: Bootstrap complete.
I20250624 14:18:05.526368 31029 raft_consensus.cc:357] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "d1211fbc127f4094b4f20ef60c0ef5f2" member_type: VOTER last_known_addr { host: "127.29.222.126" port: 39907 } }
I20250624 14:18:05.528476 31029 raft_consensus.cc:738] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: d1211fbc127f4094b4f20ef60c0ef5f2, State: Initialized, Role: FOLLOWER
I20250624 14:18:05.529230 31029 consensus_queue.cc:260] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 4, Last appended: 1.4, Last appended by leader: 4, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "d1211fbc127f4094b4f20ef60c0ef5f2" member_type: VOTER last_known_addr { host: "127.29.222.126" port: 39907 } }
I20250624 14:18:05.529763 31029 raft_consensus.cc:397] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [term 1 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 14:18:05.530032 31029 raft_consensus.cc:491] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [term 1 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 14:18:05.530359 31029 raft_consensus.cc:3058] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [term 1 FOLLOWER]: Advancing to term 2
I20250624 14:18:05.536370 31029 raft_consensus.cc:513] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "d1211fbc127f4094b4f20ef60c0ef5f2" member_type: VOTER last_known_addr { host: "127.29.222.126" port: 39907 } }
I20250624 14:18:05.537056 31029 leader_election.cc:304] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: d1211fbc127f4094b4f20ef60c0ef5f2; no voters:
I20250624 14:18:05.538631 31029 leader_election.cc:290] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [CANDIDATE]: Term 2 election: Requested vote from peers
I20250624 14:18:05.539161 31035 raft_consensus.cc:2802] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [term 2 FOLLOWER]: Leader election won for term 2
I20250624 14:18:05.541524 31035 raft_consensus.cc:695] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [term 2 LEADER]: Becoming Leader. State: Replica: d1211fbc127f4094b4f20ef60c0ef5f2, State: Running, Role: LEADER
I20250624 14:18:05.542286 31035 consensus_queue.cc:237] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 4, Committed index: 4, Last appended: 1.4, Last appended by leader: 4, Current term: 2, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "d1211fbc127f4094b4f20ef60c0ef5f2" member_type: VOTER last_known_addr { host: "127.29.222.126" port: 39907 } }
I20250624 14:18:05.543278 31029 sys_catalog.cc:564] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [sys.catalog]: configured and running, proceeding with master startup.
I20250624 14:18:05.552822 31037 sys_catalog.cc:455] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [sys.catalog]: SysCatalogTable state changed. Reason: New leader d1211fbc127f4094b4f20ef60c0ef5f2. Latest consensus state: current_term: 2 leader_uuid: "d1211fbc127f4094b4f20ef60c0ef5f2" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "d1211fbc127f4094b4f20ef60c0ef5f2" member_type: VOTER last_known_addr { host: "127.29.222.126" port: 39907 } } }
I20250624 14:18:05.553071 31036 sys_catalog.cc:455] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 2 leader_uuid: "d1211fbc127f4094b4f20ef60c0ef5f2" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "d1211fbc127f4094b4f20ef60c0ef5f2" member_type: VOTER last_known_addr { host: "127.29.222.126" port: 39907 } } }
I20250624 14:18:05.553588 31037 sys_catalog.cc:458] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [sys.catalog]: This master's current role is: LEADER
I20250624 14:18:05.553607 31036 sys_catalog.cc:458] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2 [sys.catalog]: This master's current role is: LEADER
I20250624 14:18:05.557026 31044 catalog_manager.cc:1477] Loading table and tablet metadata into memory...
I20250624 14:18:05.577677 31044 catalog_manager.cc:1486] Initializing Kudu cluster ID...
I20250624 14:18:05.581840 31044 catalog_manager.cc:1261] Loaded cluster ID: c7ebdd58ee8743d3aa26a8b0be53ecb1
I20250624 14:18:05.582047 31044 catalog_manager.cc:1497] Initializing Kudu internal certificate authority...
I20250624 14:18:05.587064 31044 catalog_manager.cc:1506] Loading token signing keys...
I20250624 14:18:05.591048 31044 catalog_manager.cc:5966] T 00000000000000000000000000000000 P d1211fbc127f4094b4f20ef60c0ef5f2: Loaded TSK: 0
I20250624 14:18:05.592167 31044 catalog_manager.cc:1516] Initializing in-progress tserver states...
I20250624 14:18:06.507117 30994 master_service.cc:432] Got heartbeat from unknown tserver (permanent_uuid: "4463b73640674948b4b479a5108fcc5b" instance_seqno: 1750774680390893) as {username='slave'} at 127.29.222.65:42573; Asking this server to re-register.
I20250624 14:18:06.508486 30815 heartbeater.cc:461] Registering TS with master...
I20250624 14:18:06.509003 30815 heartbeater.cc:507] Master 127.29.222.126:39907 requested a full tablet report, sending...
I20250624 14:18:06.511091 30994 ts_manager.cc:194] Registered new tserver with Master: 4463b73640674948b4b479a5108fcc5b (127.29.222.65:40743)
I20250624 14:18:06.521939 30585 external_mini_cluster.cc:934] 1 TS(s) registered with all masters
I20250624 14:18:06.522333 30585 test_util.cc:276] Using random seed: -12215358
I20250624 14:18:06.574647 30994 catalog_manager.cc:2232] Servicing CreateTable request from {username='slave'} at 127.0.0.1:52046:
name: "test-workload"
schema {
columns {
name: "key"
type: INT32
is_key: true
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "int_val"
type: INT32
is_key: false
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "string_val"
type: STRING
is_key: false
is_nullable: true
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
}
num_replicas: 1
split_rows_range_bounds {
rows: "<redacted>""\004\001\000\377\377\377\037\004\001\000\376\377\377?\004\001\000\375\377\377_"
indirect_data: "<redacted>"""
}
partition_schema {
range_schema {
columns {
name: "key"
}
}
}
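The three encoded split keys above decode to 536870911, 1073741822 and 1610612733, i.e. the bounds that show up in the CreateTablet lines right after this request. A minimal Python sketch that reproduces those values, assuming the test workload simply quarters the INT32 key space (an assumption; this log does not show how the splits were computed):

    INT32_MAX = 2**31 - 1
    quarter = INT32_MAX // 4              # 536870911
    splits = [quarter * i for i in range(1, 4)]
    print(splits)                         # [536870911, 1073741822, 1610612733]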
I20250624 14:18:06.635747 30747 tablet_service.cc:1468] Processing CreateTablet for tablet 705fbe1c907446ac80413ff4bf8b232f (DEFAULT_TABLE table=test-workload [id=3c126ee5ec6c4916984c81e03e23f3a8]), partition=RANGE (key) PARTITION 1610612733 <= VALUES
I20250624 14:18:06.637854 30747 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 705fbe1c907446ac80413ff4bf8b232f. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 14:18:06.639397 30748 tablet_service.cc:1468] Processing CreateTablet for tablet 5916406c041948be9564a37bc4e7aa6f (DEFAULT_TABLE table=test-workload [id=3c126ee5ec6c4916984c81e03e23f3a8]), partition=RANGE (key) PARTITION 1073741822 <= VALUES < 1610612733
I20250624 14:18:06.640363 30748 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 5916406c041948be9564a37bc4e7aa6f. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 14:18:06.643419 30749 tablet_service.cc:1468] Processing CreateTablet for tablet 9f669c7312324a9e9055c8e562e0a13d (DEFAULT_TABLE table=test-workload [id=3c126ee5ec6c4916984c81e03e23f3a8]), partition=RANGE (key) PARTITION 536870911 <= VALUES < 1073741822
I20250624 14:18:06.644503 30749 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 9f669c7312324a9e9055c8e562e0a13d. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 14:18:06.645596 30750 tablet_service.cc:1468] Processing CreateTablet for tablet a68c013880ac4723b781df095bf50a38 (DEFAULT_TABLE table=test-workload [id=3c126ee5ec6c4916984c81e03e23f3a8]), partition=RANGE (key) PARTITION VALUES < 536870911
I20250624 14:18:06.646490 30750 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet a68c013880ac4723b781df095bf50a38. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 14:18:06.679586 31066 tablet_bootstrap.cc:492] T 5916406c041948be9564a37bc4e7aa6f P 4463b73640674948b4b479a5108fcc5b: Bootstrap starting.
I20250624 14:18:06.685689 31066 tablet_bootstrap.cc:654] T 5916406c041948be9564a37bc4e7aa6f P 4463b73640674948b4b479a5108fcc5b: Neither blocks nor log segments found. Creating new log.
I20250624 14:18:06.687844 31066 log.cc:826] T 5916406c041948be9564a37bc4e7aa6f P 4463b73640674948b4b479a5108fcc5b: Log is configured to *not* fsync() on all Append() calls
I20250624 14:18:06.692945 31066 tablet_bootstrap.cc:492] T 5916406c041948be9564a37bc4e7aa6f P 4463b73640674948b4b479a5108fcc5b: No bootstrap required, opened a new log
I20250624 14:18:06.693380 31066 ts_tablet_manager.cc:1397] T 5916406c041948be9564a37bc4e7aa6f P 4463b73640674948b4b479a5108fcc5b: Time spent bootstrapping tablet: real 0.014s user 0.003s sys 0.008s
I20250624 14:18:06.711217 31066 raft_consensus.cc:357] T 5916406c041948be9564a37bc4e7aa6f P 4463b73640674948b4b479a5108fcc5b [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4463b73640674948b4b479a5108fcc5b" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 40743 } }
I20250624 14:18:06.711834 31066 raft_consensus.cc:383] T 5916406c041948be9564a37bc4e7aa6f P 4463b73640674948b4b479a5108fcc5b [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 14:18:06.712059 31066 raft_consensus.cc:738] T 5916406c041948be9564a37bc4e7aa6f P 4463b73640674948b4b479a5108fcc5b [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 4463b73640674948b4b479a5108fcc5b, State: Initialized, Role: FOLLOWER
I20250624 14:18:06.712726 31066 consensus_queue.cc:260] T 5916406c041948be9564a37bc4e7aa6f P 4463b73640674948b4b479a5108fcc5b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4463b73640674948b4b479a5108fcc5b" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 40743 } }
I20250624 14:18:06.713238 31066 raft_consensus.cc:397] T 5916406c041948be9564a37bc4e7aa6f P 4463b73640674948b4b479a5108fcc5b [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 14:18:06.713500 31066 raft_consensus.cc:491] T 5916406c041948be9564a37bc4e7aa6f P 4463b73640674948b4b479a5108fcc5b [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 14:18:06.713799 31066 raft_consensus.cc:3058] T 5916406c041948be9564a37bc4e7aa6f P 4463b73640674948b4b479a5108fcc5b [term 0 FOLLOWER]: Advancing to term 1
I20250624 14:18:06.717929 31066 raft_consensus.cc:513] T 5916406c041948be9564a37bc4e7aa6f P 4463b73640674948b4b479a5108fcc5b [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4463b73640674948b4b479a5108fcc5b" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 40743 } }
I20250624 14:18:06.718626 31066 leader_election.cc:304] T 5916406c041948be9564a37bc4e7aa6f P 4463b73640674948b4b479a5108fcc5b [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 4463b73640674948b4b479a5108fcc5b; no voters:
I20250624 14:18:06.720907 31066 leader_election.cc:290] T 5916406c041948be9564a37bc4e7aa6f P 4463b73640674948b4b479a5108fcc5b [CANDIDATE]: Term 1 election: Requested vote from peers
I20250624 14:18:06.721342 31068 raft_consensus.cc:2802] T 5916406c041948be9564a37bc4e7aa6f P 4463b73640674948b4b479a5108fcc5b [term 1 FOLLOWER]: Leader election won for term 1
I20250624 14:18:06.724670 31068 raft_consensus.cc:695] T 5916406c041948be9564a37bc4e7aa6f P 4463b73640674948b4b479a5108fcc5b [term 1 LEADER]: Becoming Leader. State: Replica: 4463b73640674948b4b479a5108fcc5b, State: Running, Role: LEADER
I20250624 14:18:06.725499 31068 consensus_queue.cc:237] T 5916406c041948be9564a37bc4e7aa6f P 4463b73640674948b4b479a5108fcc5b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4463b73640674948b4b479a5108fcc5b" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 40743 } }
I20250624 14:18:06.726343 31066 ts_tablet_manager.cc:1428] T 5916406c041948be9564a37bc4e7aa6f P 4463b73640674948b4b479a5108fcc5b: Time spent starting tablet: real 0.033s user 0.025s sys 0.008s
I20250624 14:18:06.727202 31066 tablet_bootstrap.cc:492] T 705fbe1c907446ac80413ff4bf8b232f P 4463b73640674948b4b479a5108fcc5b: Bootstrap starting.
I20250624 14:18:06.736013 31066 tablet_bootstrap.cc:654] T 705fbe1c907446ac80413ff4bf8b232f P 4463b73640674948b4b479a5108fcc5b: Neither blocks nor log segments found. Creating new log.
I20250624 14:18:06.740962 30994 catalog_manager.cc:5582] T 5916406c041948be9564a37bc4e7aa6f P 4463b73640674948b4b479a5108fcc5b reported cstate change: term changed from 0 to 1, leader changed from <none> to 4463b73640674948b4b479a5108fcc5b (127.29.222.65). New cstate: current_term: 1 leader_uuid: "4463b73640674948b4b479a5108fcc5b" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4463b73640674948b4b479a5108fcc5b" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 40743 } health_report { overall_health: HEALTHY } } }
I20250624 14:18:06.742959 31066 tablet_bootstrap.cc:492] T 705fbe1c907446ac80413ff4bf8b232f P 4463b73640674948b4b479a5108fcc5b: No bootstrap required, opened a new log
I20250624 14:18:06.743294 31066 ts_tablet_manager.cc:1397] T 705fbe1c907446ac80413ff4bf8b232f P 4463b73640674948b4b479a5108fcc5b: Time spent bootstrapping tablet: real 0.016s user 0.008s sys 0.005s
I20250624 14:18:06.745838 31066 raft_consensus.cc:357] T 705fbe1c907446ac80413ff4bf8b232f P 4463b73640674948b4b479a5108fcc5b [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4463b73640674948b4b479a5108fcc5b" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 40743 } }
I20250624 14:18:06.746286 31066 raft_consensus.cc:383] T 705fbe1c907446ac80413ff4bf8b232f P 4463b73640674948b4b479a5108fcc5b [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 14:18:06.746541 31066 raft_consensus.cc:738] T 705fbe1c907446ac80413ff4bf8b232f P 4463b73640674948b4b479a5108fcc5b [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 4463b73640674948b4b479a5108fcc5b, State: Initialized, Role: FOLLOWER
I20250624 14:18:06.747092 31066 consensus_queue.cc:260] T 705fbe1c907446ac80413ff4bf8b232f P 4463b73640674948b4b479a5108fcc5b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4463b73640674948b4b479a5108fcc5b" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 40743 } }
I20250624 14:18:06.747641 31066 raft_consensus.cc:397] T 705fbe1c907446ac80413ff4bf8b232f P 4463b73640674948b4b479a5108fcc5b [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 14:18:06.747924 31066 raft_consensus.cc:491] T 705fbe1c907446ac80413ff4bf8b232f P 4463b73640674948b4b479a5108fcc5b [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 14:18:06.748234 31066 raft_consensus.cc:3058] T 705fbe1c907446ac80413ff4bf8b232f P 4463b73640674948b4b479a5108fcc5b [term 0 FOLLOWER]: Advancing to term 1
I20250624 14:18:06.753883 31066 raft_consensus.cc:513] T 705fbe1c907446ac80413ff4bf8b232f P 4463b73640674948b4b479a5108fcc5b [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4463b73640674948b4b479a5108fcc5b" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 40743 } }
I20250624 14:18:06.754455 31066 leader_election.cc:304] T 705fbe1c907446ac80413ff4bf8b232f P 4463b73640674948b4b479a5108fcc5b [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 4463b73640674948b4b479a5108fcc5b; no voters:
I20250624 14:18:06.754920 31066 leader_election.cc:290] T 705fbe1c907446ac80413ff4bf8b232f P 4463b73640674948b4b479a5108fcc5b [CANDIDATE]: Term 1 election: Requested vote from peers
I20250624 14:18:06.755043 31068 raft_consensus.cc:2802] T 705fbe1c907446ac80413ff4bf8b232f P 4463b73640674948b4b479a5108fcc5b [term 1 FOLLOWER]: Leader election won for term 1
I20250624 14:18:06.755458 31068 raft_consensus.cc:695] T 705fbe1c907446ac80413ff4bf8b232f P 4463b73640674948b4b479a5108fcc5b [term 1 LEADER]: Becoming Leader. State: Replica: 4463b73640674948b4b479a5108fcc5b, State: Running, Role: LEADER
I20250624 14:18:06.756088 31068 consensus_queue.cc:237] T 705fbe1c907446ac80413ff4bf8b232f P 4463b73640674948b4b479a5108fcc5b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4463b73640674948b4b479a5108fcc5b" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 40743 } }
I20250624 14:18:06.756620 31066 ts_tablet_manager.cc:1428] T 705fbe1c907446ac80413ff4bf8b232f P 4463b73640674948b4b479a5108fcc5b: Time spent starting tablet: real 0.013s user 0.009s sys 0.003s
I20250624 14:18:06.757404 31066 tablet_bootstrap.cc:492] T a68c013880ac4723b781df095bf50a38 P 4463b73640674948b4b479a5108fcc5b: Bootstrap starting.
I20250624 14:18:06.763228 31066 tablet_bootstrap.cc:654] T a68c013880ac4723b781df095bf50a38 P 4463b73640674948b4b479a5108fcc5b: Neither blocks nor log segments found. Creating new log.
I20250624 14:18:06.763597 30993 catalog_manager.cc:5582] T 705fbe1c907446ac80413ff4bf8b232f P 4463b73640674948b4b479a5108fcc5b reported cstate change: term changed from 0 to 1, leader changed from <none> to 4463b73640674948b4b479a5108fcc5b (127.29.222.65). New cstate: current_term: 1 leader_uuid: "4463b73640674948b4b479a5108fcc5b" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4463b73640674948b4b479a5108fcc5b" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 40743 } health_report { overall_health: HEALTHY } } }
I20250624 14:18:06.769042 31066 tablet_bootstrap.cc:492] T a68c013880ac4723b781df095bf50a38 P 4463b73640674948b4b479a5108fcc5b: No bootstrap required, opened a new log
I20250624 14:18:06.769402 31066 ts_tablet_manager.cc:1397] T a68c013880ac4723b781df095bf50a38 P 4463b73640674948b4b479a5108fcc5b: Time spent bootstrapping tablet: real 0.012s user 0.010s sys 0.000s
I20250624 14:18:06.771932 31066 raft_consensus.cc:357] T a68c013880ac4723b781df095bf50a38 P 4463b73640674948b4b479a5108fcc5b [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4463b73640674948b4b479a5108fcc5b" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 40743 } }
I20250624 14:18:06.772532 31066 raft_consensus.cc:383] T a68c013880ac4723b781df095bf50a38 P 4463b73640674948b4b479a5108fcc5b [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 14:18:06.772868 31066 raft_consensus.cc:738] T a68c013880ac4723b781df095bf50a38 P 4463b73640674948b4b479a5108fcc5b [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 4463b73640674948b4b479a5108fcc5b, State: Initialized, Role: FOLLOWER
I20250624 14:18:06.773444 31066 consensus_queue.cc:260] T a68c013880ac4723b781df095bf50a38 P 4463b73640674948b4b479a5108fcc5b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4463b73640674948b4b479a5108fcc5b" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 40743 } }
I20250624 14:18:06.773873 31066 raft_consensus.cc:397] T a68c013880ac4723b781df095bf50a38 P 4463b73640674948b4b479a5108fcc5b [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 14:18:06.774085 31066 raft_consensus.cc:491] T a68c013880ac4723b781df095bf50a38 P 4463b73640674948b4b479a5108fcc5b [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 14:18:06.774322 31066 raft_consensus.cc:3058] T a68c013880ac4723b781df095bf50a38 P 4463b73640674948b4b479a5108fcc5b [term 0 FOLLOWER]: Advancing to term 1
I20250624 14:18:06.778244 31066 raft_consensus.cc:513] T a68c013880ac4723b781df095bf50a38 P 4463b73640674948b4b479a5108fcc5b [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4463b73640674948b4b479a5108fcc5b" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 40743 } }
I20250624 14:18:06.778743 31066 leader_election.cc:304] T a68c013880ac4723b781df095bf50a38 P 4463b73640674948b4b479a5108fcc5b [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 4463b73640674948b4b479a5108fcc5b; no voters:
I20250624 14:18:06.779203 31066 leader_election.cc:290] T a68c013880ac4723b781df095bf50a38 P 4463b73640674948b4b479a5108fcc5b [CANDIDATE]: Term 1 election: Requested vote from peers
I20250624 14:18:06.779361 31068 raft_consensus.cc:2802] T a68c013880ac4723b781df095bf50a38 P 4463b73640674948b4b479a5108fcc5b [term 1 FOLLOWER]: Leader election won for term 1
I20250624 14:18:06.779839 31068 raft_consensus.cc:695] T a68c013880ac4723b781df095bf50a38 P 4463b73640674948b4b479a5108fcc5b [term 1 LEADER]: Becoming Leader. State: Replica: 4463b73640674948b4b479a5108fcc5b, State: Running, Role: LEADER
I20250624 14:18:06.780454 31068 consensus_queue.cc:237] T a68c013880ac4723b781df095bf50a38 P 4463b73640674948b4b479a5108fcc5b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4463b73640674948b4b479a5108fcc5b" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 40743 } }
I20250624 14:18:06.780833 31066 ts_tablet_manager.cc:1428] T a68c013880ac4723b781df095bf50a38 P 4463b73640674948b4b479a5108fcc5b: Time spent starting tablet: real 0.011s user 0.008s sys 0.003s
I20250624 14:18:06.781499 31066 tablet_bootstrap.cc:492] T 9f669c7312324a9e9055c8e562e0a13d P 4463b73640674948b4b479a5108fcc5b: Bootstrap starting.
I20250624 14:18:06.786769 31066 tablet_bootstrap.cc:654] T 9f669c7312324a9e9055c8e562e0a13d P 4463b73640674948b4b479a5108fcc5b: Neither blocks nor log segments found. Creating new log.
I20250624 14:18:06.791810 30993 catalog_manager.cc:5582] T a68c013880ac4723b781df095bf50a38 P 4463b73640674948b4b479a5108fcc5b reported cstate change: term changed from 0 to 1, leader changed from <none> to 4463b73640674948b4b479a5108fcc5b (127.29.222.65). New cstate: current_term: 1 leader_uuid: "4463b73640674948b4b479a5108fcc5b" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4463b73640674948b4b479a5108fcc5b" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 40743 } health_report { overall_health: HEALTHY } } }
I20250624 14:18:06.794660 31066 tablet_bootstrap.cc:492] T 9f669c7312324a9e9055c8e562e0a13d P 4463b73640674948b4b479a5108fcc5b: No bootstrap required, opened a new log
I20250624 14:18:06.795038 31066 ts_tablet_manager.cc:1397] T 9f669c7312324a9e9055c8e562e0a13d P 4463b73640674948b4b479a5108fcc5b: Time spent bootstrapping tablet: real 0.014s user 0.009s sys 0.004s
I20250624 14:18:06.797169 31066 raft_consensus.cc:357] T 9f669c7312324a9e9055c8e562e0a13d P 4463b73640674948b4b479a5108fcc5b [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4463b73640674948b4b479a5108fcc5b" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 40743 } }
I20250624 14:18:06.797639 31066 raft_consensus.cc:383] T 9f669c7312324a9e9055c8e562e0a13d P 4463b73640674948b4b479a5108fcc5b [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 14:18:06.797914 31066 raft_consensus.cc:738] T 9f669c7312324a9e9055c8e562e0a13d P 4463b73640674948b4b479a5108fcc5b [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 4463b73640674948b4b479a5108fcc5b, State: Initialized, Role: FOLLOWER
I20250624 14:18:06.798509 31066 consensus_queue.cc:260] T 9f669c7312324a9e9055c8e562e0a13d P 4463b73640674948b4b479a5108fcc5b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4463b73640674948b4b479a5108fcc5b" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 40743 } }
I20250624 14:18:06.799170 31066 raft_consensus.cc:397] T 9f669c7312324a9e9055c8e562e0a13d P 4463b73640674948b4b479a5108fcc5b [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 14:18:06.799466 31066 raft_consensus.cc:491] T 9f669c7312324a9e9055c8e562e0a13d P 4463b73640674948b4b479a5108fcc5b [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 14:18:06.799722 31066 raft_consensus.cc:3058] T 9f669c7312324a9e9055c8e562e0a13d P 4463b73640674948b4b479a5108fcc5b [term 0 FOLLOWER]: Advancing to term 1
I20250624 14:18:06.804587 31066 raft_consensus.cc:513] T 9f669c7312324a9e9055c8e562e0a13d P 4463b73640674948b4b479a5108fcc5b [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4463b73640674948b4b479a5108fcc5b" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 40743 } }
I20250624 14:18:06.805087 31066 leader_election.cc:304] T 9f669c7312324a9e9055c8e562e0a13d P 4463b73640674948b4b479a5108fcc5b [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 4463b73640674948b4b479a5108fcc5b; no voters:
I20250624 14:18:06.805727 31068 raft_consensus.cc:2802] T 9f669c7312324a9e9055c8e562e0a13d P 4463b73640674948b4b479a5108fcc5b [term 1 FOLLOWER]: Leader election won for term 1
I20250624 14:18:06.806142 31068 raft_consensus.cc:695] T 9f669c7312324a9e9055c8e562e0a13d P 4463b73640674948b4b479a5108fcc5b [term 1 LEADER]: Becoming Leader. State: Replica: 4463b73640674948b4b479a5108fcc5b, State: Running, Role: LEADER
I20250624 14:18:06.806936 31066 leader_election.cc:290] T 9f669c7312324a9e9055c8e562e0a13d P 4463b73640674948b4b479a5108fcc5b [CANDIDATE]: Term 1 election: Requested vote from peers
I20250624 14:18:06.806758 31068 consensus_queue.cc:237] T 9f669c7312324a9e9055c8e562e0a13d P 4463b73640674948b4b479a5108fcc5b [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4463b73640674948b4b479a5108fcc5b" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 40743 } }
I20250624 14:18:06.808835 31066 ts_tablet_manager.cc:1428] T 9f669c7312324a9e9055c8e562e0a13d P 4463b73640674948b4b479a5108fcc5b: Time spent starting tablet: real 0.014s user 0.011s sys 0.000s
I20250624 14:18:06.812942 30993 catalog_manager.cc:5582] T 9f669c7312324a9e9055c8e562e0a13d P 4463b73640674948b4b479a5108fcc5b reported cstate change: term changed from 0 to 1, leader changed from <none> to 4463b73640674948b4b479a5108fcc5b (127.29.222.65). New cstate: current_term: 1 leader_uuid: "4463b73640674948b4b479a5108fcc5b" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4463b73640674948b4b479a5108fcc5b" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 40743 } health_report { overall_health: HEALTHY } } }
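All four single-replica tablets are now bootstrapped and elected on tserver 4463b736. If the per-tablet bootstrap/start timings are of interest, a rough Python sketch for pulling them out of a saved copy of this log follows (the file name is a placeholder, not something the harness writes):

    import re

    # Matches e.g. "T <tablet> P <peer>: Time spent bootstrapping tablet: real 0.014s user 0.003s sys 0.008s"
    pat = re.compile(r'T (\w+) P \w+: Time spent (bootstrapping|starting) tablet: real ([0-9.]+)s')
    with open('tablet_copy-itest.log') as f:   # placeholder path for a captured copy of this log
        for line in f:
            m = pat.search(line)
            if m:
                tablet, phase, secs = m.groups()
                print(f'{tablet[:8]}  {phase:13s} {secs}s')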
I20250624 14:18:11.274664 30585 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskTXj651/build/tsan/bin/kudu
/tmp/dist-test-taskTXj651/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.29.222.66:35417
--local_ip_for_outbound_sockets=127.29.222.66
--tserver_master_addrs=127.29.222.126:39907
--webserver_port=39137
--webserver_interface=127.29.222.66
--builtin_ntp_servers=127.29.222.84:42211
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--num_tablets_to_copy_simultaneously=1 with env {}
W20250624 14:18:11.613780 31100 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 14:18:11.614260 31100 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 14:18:11.614759 31100 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 14:18:11.647917 31100 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 14:18:11.648818 31100 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.29.222.66
I20250624 14:18:11.685840 31100 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.29.222.84:42211
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.29.222.66:35417
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.29.222.66
--webserver_port=39137
--tserver_master_addrs=127.29.222.126:39907
--num_tablets_to_copy_simultaneously=1
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.29.222.66
--log_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 14:14:57 UTC on 24a791456cd2
build id 6749
TSAN enabled
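The "non-default flags" banner above is the quickest way to confirm which test-specific settings a daemon actually started with, e.g. --num_tablets_to_copy_simultaneously=1 for this tserver. A rough Python sketch for collecting each banner into a dictionary (placeholder log path; it keys off the "non-default flags:" header and the leading "--" of each flag line):

    blocks = []                                # one dict of flags per daemon startup banner
    cur = None
    with open('tablet_copy-itest.log') as f:   # placeholder path
        for raw in f:
            line = raw.strip()
            if line.endswith('non-default flags:'):
                cur = {}
                blocks.append(cur)
            elif cur is not None and line.startswith('--'):
                key, _, value = line[2:].partition('=')
                cur[key] = value
            elif cur is not None:
                cur = None                     # first non-flag line closes the block
    print([b['num_tablets_to_copy_simultaneously']
           for b in blocks if 'num_tablets_to_copy_simultaneously' in b])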
I20250624 14:18:11.687191 31100 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 14:18:11.689571 31100 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 14:18:11.709563 31109 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 14:18:11.710327 31106 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 14:18:11.709965 31107 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 14:18:12.890190 31108 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Connection time-out
I20250624 14:18:12.890285 31100 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
I20250624 14:18:12.895355 31100 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 14:18:12.898097 31100 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 14:18:12.899556 31100 hybrid_clock.cc:648] HybridClock initialized: now 1750774692899512 us; error 56 us; skew 500 ppm
I20250624 14:18:12.900400 31100 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 14:18:12.908123 31100 webserver.cc:469] Webserver started at http://127.29.222.66:39137/ using document root <none> and password file <none>
I20250624 14:18:12.909080 31100 fs_manager.cc:362] Metadata directory not provided
I20250624 14:18:12.909327 31100 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 14:18:12.917686 31100 fs_manager.cc:714] Time spent opening directory manager: real 0.005s user 0.005s sys 0.000s
I20250624 14:18:12.922827 31116 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 14:18:12.923998 31100 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.003s sys 0.000s
I20250624 14:18:12.924337 31100 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/data,/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/wal
uuid: "974c4ecde5794a46989690774ea88ac5"
format_stamp: "Formatted at 2025-06-24 14:18:02 on dist-test-slave-6xgw"
I20250624 14:18:12.926374 31100 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 14:18:12.999125 31100 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 14:18:13.000717 31100 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 14:18:13.001168 31100 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 14:18:13.003780 31100 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 14:18:13.008531 31100 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 14:18:13.008761 31100 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250624 14:18:13.009033 31100 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 14:18:13.009199 31100 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250624 14:18:13.193823 31100 rpc_server.cc:307] RPC server started. Bound to: 127.29.222.66:35417
I20250624 14:18:13.193893 31228 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.29.222.66:35417 every 8 connection(s)
I20250624 14:18:13.196574 31100 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestTabletCopyThrottling.1750774676460238-30585-0/minicluster-data/ts-1/data/info.pb
I20250624 14:18:13.198788 30585 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskTXj651/build/tsan/bin/kudu as pid 31100
I20250624 14:18:13.219957 31229 heartbeater.cc:344] Connected to a master server at 127.29.222.126:39907
I20250624 14:18:13.220414 31229 heartbeater.cc:461] Registering TS with master...
I20250624 14:18:13.221478 31229 heartbeater.cc:507] Master 127.29.222.126:39907 requested a full tablet report, sending...
I20250624 14:18:13.223594 31235 ts_tablet_manager.cc:927] T 5916406c041948be9564a37bc4e7aa6f P 974c4ecde5794a46989690774ea88ac5: Initiating tablet copy from peer 4463b73640674948b4b479a5108fcc5b (127.29.222.65:40743)
I20250624 14:18:13.223706 30986 ts_manager.cc:194] Registered new tserver with Master: 974c4ecde5794a46989690774ea88ac5 (127.29.222.66:35417)
I20250624 14:18:13.225955 30986 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.29.222.66:45841
I20250624 14:18:13.226096 31235 tablet_copy_client.cc:323] T 5916406c041948be9564a37bc4e7aa6f P 974c4ecde5794a46989690774ea88ac5: tablet copy: Beginning tablet copy session from remote peer at address 127.29.222.65:40743
I20250624 14:18:13.236114 30790 tablet_copy_service.cc:140] P 4463b73640674948b4b479a5108fcc5b: Received BeginTabletCopySession request for tablet 5916406c041948be9564a37bc4e7aa6f from peer 974c4ecde5794a46989690774ea88ac5 ({username='slave'} at 127.29.222.66:39299)
I20250624 14:18:13.236656 30790 tablet_copy_service.cc:161] P 4463b73640674948b4b479a5108fcc5b: Beginning new tablet copy session on tablet 5916406c041948be9564a37bc4e7aa6f from peer 974c4ecde5794a46989690774ea88ac5 at {username='slave'} at 127.29.222.66:39299: session id = 974c4ecde5794a46989690774ea88ac5-5916406c041948be9564a37bc4e7aa6f
I20250624 14:18:13.241852 30790 tablet_copy_source_session.cc:215] T 5916406c041948be9564a37bc4e7aa6f P 4463b73640674948b4b479a5108fcc5b: Tablet Copy: opened 0 blocks and 1 log segments
I20250624 14:18:13.247225 31235 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 5916406c041948be9564a37bc4e7aa6f. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 14:18:13.265520 31235 tablet_copy_client.cc:806] T 5916406c041948be9564a37bc4e7aa6f P 974c4ecde5794a46989690774ea88ac5: tablet copy: Starting download of 0 data blocks...
I20250624 14:18:13.266049 31235 tablet_copy_client.cc:670] T 5916406c041948be9564a37bc4e7aa6f P 974c4ecde5794a46989690774ea88ac5: tablet copy: Starting download of 1 WAL segments...
I20250624 14:18:13.279084 31235 tablet_copy_client.cc:538] T 5916406c041948be9564a37bc4e7aa6f P 974c4ecde5794a46989690774ea88ac5: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20250624 14:18:13.286752 31235 tablet_bootstrap.cc:492] T 5916406c041948be9564a37bc4e7aa6f P 974c4ecde5794a46989690774ea88ac5: Bootstrap starting.
I20250624 14:18:13.429031 31235 log.cc:826] T 5916406c041948be9564a37bc4e7aa6f P 974c4ecde5794a46989690774ea88ac5: Log is configured to *not* fsync() on all Append() calls
I20250624 14:18:14.229400 31229 heartbeater.cc:499] Master 127.29.222.126:39907 was elected leader, sending a full tablet report...
I20250624 14:18:14.482563 31235 tablet_bootstrap.cc:492] T 5916406c041948be9564a37bc4e7aa6f P 974c4ecde5794a46989690774ea88ac5: Bootstrap replayed 1/1 log segments. Stats: ops{read=211 overwritten=0 applied=211 ignored=0} inserts{seen=2644 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 14:18:14.483677 31235 tablet_bootstrap.cc:492] T 5916406c041948be9564a37bc4e7aa6f P 974c4ecde5794a46989690774ea88ac5: Bootstrap complete.
I20250624 14:18:14.484532 31235 ts_tablet_manager.cc:1397] T 5916406c041948be9564a37bc4e7aa6f P 974c4ecde5794a46989690774ea88ac5: Time spent bootstrapping tablet: real 1.198s user 1.154s sys 0.036s
I20250624 14:18:14.497308 31235 raft_consensus.cc:357] T 5916406c041948be9564a37bc4e7aa6f P 974c4ecde5794a46989690774ea88ac5 [term 1 NON_PARTICIPANT]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4463b73640674948b4b479a5108fcc5b" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 40743 } }
I20250624 14:18:14.497954 31235 raft_consensus.cc:738] T 5916406c041948be9564a37bc4e7aa6f P 974c4ecde5794a46989690774ea88ac5 [term 1 NON_PARTICIPANT]: Becoming Follower/Learner. State: Replica: 974c4ecde5794a46989690774ea88ac5, State: Initialized, Role: NON_PARTICIPANT
I20250624 14:18:14.498528 31235 consensus_queue.cc:260] T 5916406c041948be9564a37bc4e7aa6f P 974c4ecde5794a46989690774ea88ac5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 211, Last appended: 1.211, Last appended by leader: 211, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4463b73640674948b4b479a5108fcc5b" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 40743 } }
I20250624 14:18:14.501398 31235 ts_tablet_manager.cc:1428] T 5916406c041948be9564a37bc4e7aa6f P 974c4ecde5794a46989690774ea88ac5: Time spent starting tablet: real 0.016s user 0.015s sys 0.000s
I20250624 14:18:14.503008 30790 tablet_copy_service.cc:342] P 4463b73640674948b4b479a5108fcc5b: Request end of tablet copy session 974c4ecde5794a46989690774ea88ac5-5916406c041948be9564a37bc4e7aa6f received from {username='slave'} at 127.29.222.66:39299
I20250624 14:18:14.503387 30790 tablet_copy_service.cc:434] P 4463b73640674948b4b479a5108fcc5b: ending tablet copy session 974c4ecde5794a46989690774ea88ac5-5916406c041948be9564a37bc4e7aa6f on tablet 5916406c041948be9564a37bc4e7aa6f with peer 974c4ecde5794a46989690774ea88ac5
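The first copy has completed: the downloaded WAL segment was replayed in full (211 ops, 2644 inserted rows, roughly 1.2 s of bootstrap for tablet 5916406c). A small Python sketch for summarizing the "Bootstrap replayed" stats lines across the log (placeholder path):

    import re

    pat = re.compile(r'T (\w+) P \w+: Bootstrap replayed .*ops\{read=(\d+).*?applied=(\d+)'
                     r'.*?inserts\{seen=(\d+)')
    with open('tablet_copy-itest.log') as f:   # placeholder path
        for line in f:
            m = pat.search(line)
            if m:
                tablet, ops_read, applied, rows = m.groups()
                print(f'{tablet[:8]}: ops read={ops_read} applied={applied} rows inserted={rows}')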
I20250624 14:18:14.505864 31235 ts_tablet_manager.cc:927] T a68c013880ac4723b781df095bf50a38 P 974c4ecde5794a46989690774ea88ac5: Initiating tablet copy from peer 4463b73640674948b4b479a5108fcc5b (127.29.222.65:40743)
I20250624 14:18:14.507766 31235 tablet_copy_client.cc:323] T a68c013880ac4723b781df095bf50a38 P 974c4ecde5794a46989690774ea88ac5: tablet copy: Beginning tablet copy session from remote peer at address 127.29.222.65:40743
I20250624 14:18:14.508988 30790 tablet_copy_service.cc:140] P 4463b73640674948b4b479a5108fcc5b: Received BeginTabletCopySession request for tablet a68c013880ac4723b781df095bf50a38 from peer 974c4ecde5794a46989690774ea88ac5 ({username='slave'} at 127.29.222.66:39299)
I20250624 14:18:14.509354 30790 tablet_copy_service.cc:161] P 4463b73640674948b4b479a5108fcc5b: Beginning new tablet copy session on tablet a68c013880ac4723b781df095bf50a38 from peer 974c4ecde5794a46989690774ea88ac5 at {username='slave'} at 127.29.222.66:39299: session id = 974c4ecde5794a46989690774ea88ac5-a68c013880ac4723b781df095bf50a38
I20250624 14:18:14.513965 30790 tablet_copy_source_session.cc:215] T a68c013880ac4723b781df095bf50a38 P 4463b73640674948b4b479a5108fcc5b: Tablet Copy: opened 0 blocks and 1 log segments
I20250624 14:18:14.516054 31235 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet a68c013880ac4723b781df095bf50a38. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 14:18:14.523976 31235 tablet_copy_client.cc:806] T a68c013880ac4723b781df095bf50a38 P 974c4ecde5794a46989690774ea88ac5: tablet copy: Starting download of 0 data blocks...
I20250624 14:18:14.524402 31235 tablet_copy_client.cc:670] T a68c013880ac4723b781df095bf50a38 P 974c4ecde5794a46989690774ea88ac5: tablet copy: Starting download of 1 WAL segments...
I20250624 14:18:14.535754 31235 tablet_copy_client.cc:538] T a68c013880ac4723b781df095bf50a38 P 974c4ecde5794a46989690774ea88ac5: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20250624 14:18:14.541074 31235 tablet_bootstrap.cc:492] T a68c013880ac4723b781df095bf50a38 P 974c4ecde5794a46989690774ea88ac5: Bootstrap starting.
I20250624 14:18:15.632572 31235 tablet_bootstrap.cc:492] T a68c013880ac4723b781df095bf50a38 P 974c4ecde5794a46989690774ea88ac5: Bootstrap replayed 1/1 log segments. Stats: ops{read=211 overwritten=0 applied=211 ignored=0} inserts{seen=2587 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 14:18:15.633265 31235 tablet_bootstrap.cc:492] T a68c013880ac4723b781df095bf50a38 P 974c4ecde5794a46989690774ea88ac5: Bootstrap complete.
I20250624 14:18:15.633741 31235 ts_tablet_manager.cc:1397] T a68c013880ac4723b781df095bf50a38 P 974c4ecde5794a46989690774ea88ac5: Time spent bootstrapping tablet: real 1.093s user 1.071s sys 0.020s
I20250624 14:18:15.635676 31235 raft_consensus.cc:357] T a68c013880ac4723b781df095bf50a38 P 974c4ecde5794a46989690774ea88ac5 [term 1 NON_PARTICIPANT]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4463b73640674948b4b479a5108fcc5b" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 40743 } }
I20250624 14:18:15.636098 31235 raft_consensus.cc:738] T a68c013880ac4723b781df095bf50a38 P 974c4ecde5794a46989690774ea88ac5 [term 1 NON_PARTICIPANT]: Becoming Follower/Learner. State: Replica: 974c4ecde5794a46989690774ea88ac5, State: Initialized, Role: NON_PARTICIPANT
I20250624 14:18:15.636482 31235 consensus_queue.cc:260] T a68c013880ac4723b781df095bf50a38 P 974c4ecde5794a46989690774ea88ac5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 211, Last appended: 1.211, Last appended by leader: 211, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4463b73640674948b4b479a5108fcc5b" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 40743 } }
I20250624 14:18:15.639124 31235 ts_tablet_manager.cc:1428] T a68c013880ac4723b781df095bf50a38 P 974c4ecde5794a46989690774ea88ac5: Time spent starting tablet: real 0.005s user 0.008s sys 0.000s
I20250624 14:18:15.640542 30790 tablet_copy_service.cc:342] P 4463b73640674948b4b479a5108fcc5b: Request end of tablet copy session 974c4ecde5794a46989690774ea88ac5-a68c013880ac4723b781df095bf50a38 received from {username='slave'} at 127.29.222.66:39299
I20250624 14:18:15.640854 30790 tablet_copy_service.cc:434] P 4463b73640674948b4b479a5108fcc5b: ending tablet copy session 974c4ecde5794a46989690774ea88ac5-a68c013880ac4723b781df095bf50a38 on tablet a68c013880ac4723b781df095bf50a38 with peer 974c4ecde5794a46989690774ea88ac5
W20250624 14:18:15.642778 31235 ts_tablet_manager.cc:726] T 5916406c041948be9564a37bc4e7aa6f P 974c4ecde5794a46989690774ea88ac5: Tablet Copy: Invalid argument: Leader has replica of tablet 5916406c041948be9564a37bc4e7aa6f with term 0, which is lower than last-logged term 1 on local replica. Rejecting tablet copy request
I20250624 14:18:15.648396 31235 ts_tablet_manager.cc:927] T 705fbe1c907446ac80413ff4bf8b232f P 974c4ecde5794a46989690774ea88ac5: Initiating tablet copy from peer 4463b73640674948b4b479a5108fcc5b (127.29.222.65:40743)
I20250624 14:18:15.649889 31235 tablet_copy_client.cc:323] T 705fbe1c907446ac80413ff4bf8b232f P 974c4ecde5794a46989690774ea88ac5: tablet copy: Beginning tablet copy session from remote peer at address 127.29.222.65:40743
I20250624 14:18:15.651062 30790 tablet_copy_service.cc:140] P 4463b73640674948b4b479a5108fcc5b: Received BeginTabletCopySession request for tablet 705fbe1c907446ac80413ff4bf8b232f from peer 974c4ecde5794a46989690774ea88ac5 ({username='slave'} at 127.29.222.66:39299)
I20250624 14:18:15.651465 30790 tablet_copy_service.cc:161] P 4463b73640674948b4b479a5108fcc5b: Beginning new tablet copy session on tablet 705fbe1c907446ac80413ff4bf8b232f from peer 974c4ecde5794a46989690774ea88ac5 at {username='slave'} at 127.29.222.66:39299: session id = 974c4ecde5794a46989690774ea88ac5-705fbe1c907446ac80413ff4bf8b232f
I20250624 14:18:15.655859 30790 tablet_copy_source_session.cc:215] T 705fbe1c907446ac80413ff4bf8b232f P 4463b73640674948b4b479a5108fcc5b: Tablet Copy: opened 0 blocks and 1 log segments
I20250624 14:18:15.657936 31235 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 705fbe1c907446ac80413ff4bf8b232f. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 14:18:15.665928 31235 tablet_copy_client.cc:806] T 705fbe1c907446ac80413ff4bf8b232f P 974c4ecde5794a46989690774ea88ac5: tablet copy: Starting download of 0 data blocks...
I20250624 14:18:15.666347 31235 tablet_copy_client.cc:670] T 705fbe1c907446ac80413ff4bf8b232f P 974c4ecde5794a46989690774ea88ac5: tablet copy: Starting download of 1 WAL segments...
I20250624 14:18:15.678345 31235 tablet_copy_client.cc:538] T 705fbe1c907446ac80413ff4bf8b232f P 974c4ecde5794a46989690774ea88ac5: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20250624 14:18:15.684245 31235 tablet_bootstrap.cc:492] T 705fbe1c907446ac80413ff4bf8b232f P 974c4ecde5794a46989690774ea88ac5: Bootstrap starting.
I20250624 14:18:16.778483 31235 tablet_bootstrap.cc:492] T 705fbe1c907446ac80413ff4bf8b232f P 974c4ecde5794a46989690774ea88ac5: Bootstrap replayed 1/1 log segments. Stats: ops{read=211 overwritten=0 applied=211 ignored=0} inserts{seen=2602 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 14:18:16.779305 31235 tablet_bootstrap.cc:492] T 705fbe1c907446ac80413ff4bf8b232f P 974c4ecde5794a46989690774ea88ac5: Bootstrap complete.
I20250624 14:18:16.779901 31235 ts_tablet_manager.cc:1397] T 705fbe1c907446ac80413ff4bf8b232f P 974c4ecde5794a46989690774ea88ac5: Time spent bootstrapping tablet: real 1.096s user 1.063s sys 0.032s
I20250624 14:18:16.781677 31235 raft_consensus.cc:357] T 705fbe1c907446ac80413ff4bf8b232f P 974c4ecde5794a46989690774ea88ac5 [term 1 NON_PARTICIPANT]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4463b73640674948b4b479a5108fcc5b" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 40743 } }
I20250624 14:18:16.782018 31235 raft_consensus.cc:738] T 705fbe1c907446ac80413ff4bf8b232f P 974c4ecde5794a46989690774ea88ac5 [term 1 NON_PARTICIPANT]: Becoming Follower/Learner. State: Replica: 974c4ecde5794a46989690774ea88ac5, State: Initialized, Role: NON_PARTICIPANT
I20250624 14:18:16.782413 31235 consensus_queue.cc:260] T 705fbe1c907446ac80413ff4bf8b232f P 974c4ecde5794a46989690774ea88ac5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 211, Last appended: 1.211, Last appended by leader: 211, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4463b73640674948b4b479a5108fcc5b" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 40743 } }
I20250624 14:18:16.784680 31235 ts_tablet_manager.cc:1428] T 705fbe1c907446ac80413ff4bf8b232f P 974c4ecde5794a46989690774ea88ac5: Time spent starting tablet: real 0.005s user 0.005s sys 0.000s
I20250624 14:18:16.786078 30790 tablet_copy_service.cc:342] P 4463b73640674948b4b479a5108fcc5b: Request end of tablet copy session 974c4ecde5794a46989690774ea88ac5-705fbe1c907446ac80413ff4bf8b232f received from {username='slave'} at 127.29.222.66:39299
I20250624 14:18:16.786471 30790 tablet_copy_service.cc:434] P 4463b73640674948b4b479a5108fcc5b: ending tablet copy session 974c4ecde5794a46989690774ea88ac5-705fbe1c907446ac80413ff4bf8b232f on tablet 705fbe1c907446ac80413ff4bf8b232f with peer 974c4ecde5794a46989690774ea88ac5
W20250624 14:18:16.789634 31235 ts_tablet_manager.cc:726] T a68c013880ac4723b781df095bf50a38 P 974c4ecde5794a46989690774ea88ac5: Tablet Copy: Invalid argument: Leader has replica of tablet a68c013880ac4723b781df095bf50a38 with term 0, which is lower than last-logged term 1 on local replica. Rejecting tablet copy request
W20250624 14:18:16.796545 31235 ts_tablet_manager.cc:726] T 705fbe1c907446ac80413ff4bf8b232f P 974c4ecde5794a46989690774ea88ac5: Tablet Copy: Invalid argument: Leader has replica of tablet 705fbe1c907446ac80413ff4bf8b232f with term 0, which is lower than last-logged term 1 on local replica. Rejecting tablet copy request
I20250624 14:18:16.800297 31235 ts_tablet_manager.cc:927] T 9f669c7312324a9e9055c8e562e0a13d P 974c4ecde5794a46989690774ea88ac5: Initiating tablet copy from peer 4463b73640674948b4b479a5108fcc5b (127.29.222.65:40743)
I20250624 14:18:16.801347 31235 tablet_copy_client.cc:323] T 9f669c7312324a9e9055c8e562e0a13d P 974c4ecde5794a46989690774ea88ac5: tablet copy: Beginning tablet copy session from remote peer at address 127.29.222.65:40743
I20250624 14:18:16.802585 30790 tablet_copy_service.cc:140] P 4463b73640674948b4b479a5108fcc5b: Received BeginTabletCopySession request for tablet 9f669c7312324a9e9055c8e562e0a13d from peer 974c4ecde5794a46989690774ea88ac5 ({username='slave'} at 127.29.222.66:39299)
I20250624 14:18:16.803013 30790 tablet_copy_service.cc:161] P 4463b73640674948b4b479a5108fcc5b: Beginning new tablet copy session on tablet 9f669c7312324a9e9055c8e562e0a13d from peer 974c4ecde5794a46989690774ea88ac5 at {username='slave'} at 127.29.222.66:39299: session id = 974c4ecde5794a46989690774ea88ac5-9f669c7312324a9e9055c8e562e0a13d
I20250624 14:18:16.807401 30790 tablet_copy_source_session.cc:215] T 9f669c7312324a9e9055c8e562e0a13d P 4463b73640674948b4b479a5108fcc5b: Tablet Copy: opened 0 blocks and 1 log segments
I20250624 14:18:16.809648 31235 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 9f669c7312324a9e9055c8e562e0a13d. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 14:18:16.817538 31235 tablet_copy_client.cc:806] T 9f669c7312324a9e9055c8e562e0a13d P 974c4ecde5794a46989690774ea88ac5: tablet copy: Starting download of 0 data blocks...
I20250624 14:18:16.817994 31235 tablet_copy_client.cc:670] T 9f669c7312324a9e9055c8e562e0a13d P 974c4ecde5794a46989690774ea88ac5: tablet copy: Starting download of 1 WAL segments...
I20250624 14:18:16.829811 31235 tablet_copy_client.cc:538] T 9f669c7312324a9e9055c8e562e0a13d P 974c4ecde5794a46989690774ea88ac5: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20250624 14:18:16.835654 31235 tablet_bootstrap.cc:492] T 9f669c7312324a9e9055c8e562e0a13d P 974c4ecde5794a46989690774ea88ac5: Bootstrap starting.
I20250624 14:18:17.899173 31235 tablet_bootstrap.cc:492] T 9f669c7312324a9e9055c8e562e0a13d P 974c4ecde5794a46989690774ea88ac5: Bootstrap replayed 1/1 log segments. Stats: ops{read=211 overwritten=0 applied=211 ignored=0} inserts{seen=2667 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 14:18:17.900002 31235 tablet_bootstrap.cc:492] T 9f669c7312324a9e9055c8e562e0a13d P 974c4ecde5794a46989690774ea88ac5: Bootstrap complete.
I20250624 14:18:17.900614 31235 ts_tablet_manager.cc:1397] T 9f669c7312324a9e9055c8e562e0a13d P 974c4ecde5794a46989690774ea88ac5: Time spent bootstrapping tablet: real 1.065s user 1.055s sys 0.008s
I20250624 14:18:17.902213 31235 raft_consensus.cc:357] T 9f669c7312324a9e9055c8e562e0a13d P 974c4ecde5794a46989690774ea88ac5 [term 1 NON_PARTICIPANT]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4463b73640674948b4b479a5108fcc5b" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 40743 } }
I20250624 14:18:17.902684 31235 raft_consensus.cc:738] T 9f669c7312324a9e9055c8e562e0a13d P 974c4ecde5794a46989690774ea88ac5 [term 1 NON_PARTICIPANT]: Becoming Follower/Learner. State: Replica: 974c4ecde5794a46989690774ea88ac5, State: Initialized, Role: NON_PARTICIPANT
I20250624 14:18:17.903178 31235 consensus_queue.cc:260] T 9f669c7312324a9e9055c8e562e0a13d P 974c4ecde5794a46989690774ea88ac5 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 211, Last appended: 1.211, Last appended by leader: 211, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "4463b73640674948b4b479a5108fcc5b" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 40743 } }
I20250624 14:18:17.905246 31235 ts_tablet_manager.cc:1428] T 9f669c7312324a9e9055c8e562e0a13d P 974c4ecde5794a46989690774ea88ac5: Time spent starting tablet: real 0.004s user 0.005s sys 0.000s
I20250624 14:18:17.906780 30790 tablet_copy_service.cc:342] P 4463b73640674948b4b479a5108fcc5b: Request end of tablet copy session 974c4ecde5794a46989690774ea88ac5-9f669c7312324a9e9055c8e562e0a13d received from {username='slave'} at 127.29.222.66:39299
I20250624 14:18:17.907147 30790 tablet_copy_service.cc:434] P 4463b73640674948b4b479a5108fcc5b: ending tablet copy session 974c4ecde5794a46989690774ea88ac5-9f669c7312324a9e9055c8e562e0a13d on tablet 9f669c7312324a9e9055c8e562e0a13d with peer 974c4ecde5794a46989690774ea88ac5
W20250624 14:18:17.909233 31235 ts_tablet_manager.cc:726] T 9f669c7312324a9e9055c8e562e0a13d P 974c4ecde5794a46989690774ea88ac5: Tablet Copy: Invalid argument: Leader has replica of tablet 9f669c7312324a9e9055c8e562e0a13d with term 0, which is lower than last-logged term 1 on local replica. Rejecting tablet copy request
I20250624 14:18:17.913779 30585 tablet_copy-itest.cc:1252] Number of Service unavailable responses: 1376
I20250624 14:18:17.914178 30585 tablet_copy-itest.cc:1253] Number of in progress responses: 922
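These two counters are the test's own summary of the throttling behaviour: with --num_tablets_to_copy_simultaneously=1 on the new tserver, most concurrent copy attempts were turned away, and the test counted 1376 "service unavailable" and 922 "in progress" responses (reading them as throttle rejections is an inference from the test name and flags, not something this log states). To extract the counters from a saved log (placeholder path):

    import re

    with open('tablet_copy-itest.log') as f:   # placeholder path
        for line in f:
            m = re.search(r'Number of (Service unavailable|in progress) responses: (\d+)', line)
            if m:
                print(f'{m.group(1)}: {m.group(2)}')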
I20250624 14:18:17.916510 30585 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskTXj651/build/tsan/bin/kudu with pid 30686
I20250624 14:18:17.970083 30585 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskTXj651/build/tsan/bin/kudu with pid 31100
I20250624 14:18:18.004326 30585 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskTXj651/build/tsan/bin/kudu with pid 30960
2025-06-24T14:18:18Z chronyd exiting
[ OK ] TabletCopyITest.TestTabletCopyThrottling (21425 ms)
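The gtest trailer records 21425 ms (about 21.4 s) of wall-clock time for this test, most of it spent in the throttled copies and WAL replays above. A small Python sketch for pulling per-test durations out of a saved run log (placeholder path; real gtest output pads the bracket column, which the pattern tolerates):

    import re

    with open('tablet_copy-itest.log') as f:   # placeholder path
        for line in f:
            m = re.search(r'\[\s+(OK|FAILED|SKIPPED)\s+\]\s+(\S+) \((\d+) ms\)', line)
            if m:
                status, test, ms = m.groups()
                print(f'{test}: {status} in {int(ms) / 1000:.1f} s')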
[ RUN ] TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate
2025-06-24T14:18:18Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2025-06-24T14:18:18Z Disabled control of system clock
I20250624 14:18:18.105607 30585 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskTXj651/build/tsan/bin/kudu
/tmp/dist-test-taskTXj651/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.29.222.126:35313
--webserver_interface=127.29.222.126
--webserver_port=0
--builtin_ntp_servers=127.29.222.84:46113
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.29.222.126:35313 with env {}
W20250624 14:18:18.441140 31254 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 14:18:18.441776 31254 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 14:18:18.442250 31254 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 14:18:18.474007 31254 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20250624 14:18:18.474320 31254 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 14:18:18.474650 31254 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20250624 14:18:18.474920 31254 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20250624 14:18:18.510836 31254 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.29.222.84:46113
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.29.222.126:35313
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.29.222.126:35313
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.29.222.126
--webserver_port=0
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true
Master server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 14:14:57 UTC on 24a791456cd2
build id 6749
TSAN enabled
I20250624 14:18:18.512332 31254 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 14:18:18.514048 31254 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 14:18:18.529520 31260 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 14:18:18.530388 31261 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 14:18:18.532367 31263 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 14:18:18.532523 31254 server_base.cc:1048] running on GCE node
I20250624 14:18:19.712401 31254 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 14:18:19.715116 31254 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 14:18:19.716493 31254 hybrid_clock.cc:648] HybridClock initialized: now 1750774699716450 us; error 56 us; skew 500 ppm
I20250624 14:18:19.717566 31254 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 14:18:19.730316 31254 webserver.cc:469] Webserver started at http://127.29.222.126:34979/ using document root <none> and password file <none>
I20250624 14:18:19.731369 31254 fs_manager.cc:362] Metadata directory not provided
I20250624 14:18:19.731600 31254 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 14:18:19.732095 31254 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 14:18:19.736697 31254 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/master-0/data/instance:
uuid: "80a408b7a98d46979491f0e4c8fa8e35"
format_stamp: "Formatted at 2025-06-24 14:18:19 on dist-test-slave-6xgw"
I20250624 14:18:19.737824 31254 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/master-0/wal/instance:
uuid: "80a408b7a98d46979491f0e4c8fa8e35"
format_stamp: "Formatted at 2025-06-24 14:18:19 on dist-test-slave-6xgw"
I20250624 14:18:19.745153 31254 fs_manager.cc:696] Time spent creating directory manager: real 0.007s user 0.009s sys 0.000s
I20250624 14:18:19.750783 31270 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 14:18:19.751905 31254 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.001s sys 0.002s
I20250624 14:18:19.752225 31254 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/master-0/data,/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/master-0/wal
uuid: "80a408b7a98d46979491f0e4c8fa8e35"
format_stamp: "Formatted at 2025-06-24 14:18:19 on dist-test-slave-6xgw"
I20250624 14:18:19.752563 31254 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 14:18:19.847636 31254 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 14:18:19.849100 31254 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 14:18:19.849543 31254 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 14:18:19.923393 31254 rpc_server.cc:307] RPC server started. Bound to: 127.29.222.126:35313
I20250624 14:18:19.923503 31321 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.29.222.126:35313 every 8 connection(s)
I20250624 14:18:19.926187 31254 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/master-0/data/info.pb
I20250624 14:18:19.932559 31322 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 14:18:19.934298 30585 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskTXj651/build/tsan/bin/kudu as pid 31254
I20250624 14:18:19.934728 30585 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/master-0/wal/instance
I20250624 14:18:19.953413 31322 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 80a408b7a98d46979491f0e4c8fa8e35: Bootstrap starting.
I20250624 14:18:19.960028 31322 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 80a408b7a98d46979491f0e4c8fa8e35: Neither blocks nor log segments found. Creating new log.
I20250624 14:18:19.962075 31322 log.cc:826] T 00000000000000000000000000000000 P 80a408b7a98d46979491f0e4c8fa8e35: Log is configured to *not* fsync() on all Append() calls
I20250624 14:18:19.966836 31322 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 80a408b7a98d46979491f0e4c8fa8e35: No bootstrap required, opened a new log
I20250624 14:18:19.984704 31322 raft_consensus.cc:357] T 00000000000000000000000000000000 P 80a408b7a98d46979491f0e4c8fa8e35 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "80a408b7a98d46979491f0e4c8fa8e35" member_type: VOTER last_known_addr { host: "127.29.222.126" port: 35313 } }
I20250624 14:18:19.985402 31322 raft_consensus.cc:383] T 00000000000000000000000000000000 P 80a408b7a98d46979491f0e4c8fa8e35 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 14:18:19.985661 31322 raft_consensus.cc:738] T 00000000000000000000000000000000 P 80a408b7a98d46979491f0e4c8fa8e35 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 80a408b7a98d46979491f0e4c8fa8e35, State: Initialized, Role: FOLLOWER
I20250624 14:18:19.986377 31322 consensus_queue.cc:260] T 00000000000000000000000000000000 P 80a408b7a98d46979491f0e4c8fa8e35 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "80a408b7a98d46979491f0e4c8fa8e35" member_type: VOTER last_known_addr { host: "127.29.222.126" port: 35313 } }
I20250624 14:18:19.986894 31322 raft_consensus.cc:397] T 00000000000000000000000000000000 P 80a408b7a98d46979491f0e4c8fa8e35 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 14:18:19.987246 31322 raft_consensus.cc:491] T 00000000000000000000000000000000 P 80a408b7a98d46979491f0e4c8fa8e35 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 14:18:19.987641 31322 raft_consensus.cc:3058] T 00000000000000000000000000000000 P 80a408b7a98d46979491f0e4c8fa8e35 [term 0 FOLLOWER]: Advancing to term 1
I20250624 14:18:19.991824 31322 raft_consensus.cc:513] T 00000000000000000000000000000000 P 80a408b7a98d46979491f0e4c8fa8e35 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "80a408b7a98d46979491f0e4c8fa8e35" member_type: VOTER last_known_addr { host: "127.29.222.126" port: 35313 } }
I20250624 14:18:19.992574 31322 leader_election.cc:304] T 00000000000000000000000000000000 P 80a408b7a98d46979491f0e4c8fa8e35 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 80a408b7a98d46979491f0e4c8fa8e35; no voters:
I20250624 14:18:19.994225 31322 leader_election.cc:290] T 00000000000000000000000000000000 P 80a408b7a98d46979491f0e4c8fa8e35 [CANDIDATE]: Term 1 election: Requested vote from peers
I20250624 14:18:19.995052 31327 raft_consensus.cc:2802] T 00000000000000000000000000000000 P 80a408b7a98d46979491f0e4c8fa8e35 [term 1 FOLLOWER]: Leader election won for term 1
I20250624 14:18:19.997301 31327 raft_consensus.cc:695] T 00000000000000000000000000000000 P 80a408b7a98d46979491f0e4c8fa8e35 [term 1 LEADER]: Becoming Leader. State: Replica: 80a408b7a98d46979491f0e4c8fa8e35, State: Running, Role: LEADER
I20250624 14:18:19.998090 31327 consensus_queue.cc:237] T 00000000000000000000000000000000 P 80a408b7a98d46979491f0e4c8fa8e35 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "80a408b7a98d46979491f0e4c8fa8e35" member_type: VOTER last_known_addr { host: "127.29.222.126" port: 35313 } }
I20250624 14:18:19.999168 31322 sys_catalog.cc:564] T 00000000000000000000000000000000 P 80a408b7a98d46979491f0e4c8fa8e35 [sys.catalog]: configured and running, proceeding with master startup.
I20250624 14:18:20.012192 31329 sys_catalog.cc:455] T 00000000000000000000000000000000 P 80a408b7a98d46979491f0e4c8fa8e35 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 80a408b7a98d46979491f0e4c8fa8e35. Latest consensus state: current_term: 1 leader_uuid: "80a408b7a98d46979491f0e4c8fa8e35" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "80a408b7a98d46979491f0e4c8fa8e35" member_type: VOTER last_known_addr { host: "127.29.222.126" port: 35313 } } }
I20250624 14:18:20.013070 31329 sys_catalog.cc:458] T 00000000000000000000000000000000 P 80a408b7a98d46979491f0e4c8fa8e35 [sys.catalog]: This master's current role is: LEADER
I20250624 14:18:20.012516 31328 sys_catalog.cc:455] T 00000000000000000000000000000000 P 80a408b7a98d46979491f0e4c8fa8e35 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "80a408b7a98d46979491f0e4c8fa8e35" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "80a408b7a98d46979491f0e4c8fa8e35" member_type: VOTER last_known_addr { host: "127.29.222.126" port: 35313 } } }
I20250624 14:18:20.013501 31328 sys_catalog.cc:458] T 00000000000000000000000000000000 P 80a408b7a98d46979491f0e4c8fa8e35 [sys.catalog]: This master's current role is: LEADER
I20250624 14:18:20.016819 31334 catalog_manager.cc:1477] Loading table and tablet metadata into memory...
I20250624 14:18:20.029345 31334 catalog_manager.cc:1486] Initializing Kudu cluster ID...
I20250624 14:18:20.048568 31334 catalog_manager.cc:1349] Generated new cluster ID: ee35f93f19f1485f818f35ad0b1bdafc
I20250624 14:18:20.048942 31334 catalog_manager.cc:1497] Initializing Kudu internal certificate authority...
I20250624 14:18:20.069195 31334 catalog_manager.cc:1372] Generated new certificate authority record
I20250624 14:18:20.070978 31334 catalog_manager.cc:1506] Loading token signing keys...
I20250624 14:18:20.083556 31334 catalog_manager.cc:5955] T 00000000000000000000000000000000 P 80a408b7a98d46979491f0e4c8fa8e35: Generated new TSK 0
I20250624 14:18:20.084468 31334 catalog_manager.cc:1516] Initializing in-progress tserver states...
I20250624 14:18:20.092676 30585 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskTXj651/build/tsan/bin/kudu
/tmp/dist-test-taskTXj651/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.29.222.65:0
--local_ip_for_outbound_sockets=127.29.222.65
--webserver_interface=127.29.222.65
--webserver_port=0
--tserver_master_addrs=127.29.222.126:35313
--builtin_ntp_servers=127.29.222.84:46113
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--enable_flush_memrowset=false
--enable_flush_deltamemstores=false
--tablet_copy_download_threads_nums_per_session=4
--log_segment_size_mb=1 with env {}
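[editor's note] The non-default tserver flags above (no memrowset/deltamemstore flushes, 4 download threads per tablet-copy session, 1 MiB WAL segments) are what exercise the parallel WAL download path in this test. Below is a minimal sketch of how an integration test might request such flags through Kudu's internal ExternalMiniCluster test utility; the field names and constructor shape are assumptions based on how that utility is commonly used, not a verified excerpt of this test's source.

    // Hedged sketch (assumed internal test API, not verified against this test):
    // start a 3-tserver external mini cluster with the non-default flags
    // logged above for each tablet server.
    #include <memory>
    #include <string>
    #include <vector>

    #include "kudu/mini-cluster/external_mini_cluster.h"
    #include "kudu/util/status.h"

    using kudu::cluster::ExternalMiniCluster;
    using kudu::cluster::ExternalMiniClusterOptions;

    kudu::Status StartThreeNodeCluster(std::unique_ptr<ExternalMiniCluster>* out) {
      ExternalMiniClusterOptions opts;              // field names are assumptions
      opts.num_tablet_servers = 3;
      // Mirror the flags in the log: keep in-memory stores from flushing,
      // download WAL segments with 4 threads per copy session, and roll
      // small 1 MiB segments so there are many segments to copy.
      opts.extra_tserver_flags = {
        "--enable_flush_memrowset=false",
        "--enable_flush_deltamemstores=false",
        "--tablet_copy_download_threads_nums_per_session=4",
        "--log_segment_size_mb=1",
      };
      out->reset(new ExternalMiniCluster(std::move(opts)));
      return (*out)->Start();
    }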
W20250624 14:18:20.451735 31346 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 14:18:20.452272 31346 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 14:18:20.452702 31346 flags.cc:425] Enabled unsafe flag: --enable_flush_deltamemstores=false
W20250624 14:18:20.452921 31346 flags.cc:425] Enabled unsafe flag: --enable_flush_memrowset=false
W20250624 14:18:20.453250 31346 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 14:18:20.486086 31346 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 14:18:20.487149 31346 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.29.222.65
I20250624 14:18:20.523553 31346 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.29.222.84:46113
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_segment_size_mb=1
--fs_data_dirs=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.29.222.65:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.29.222.65
--webserver_port=0
--enable_flush_deltamemstores=false
--enable_flush_memrowset=false
--tserver_master_addrs=127.29.222.126:35313
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.29.222.65
--log_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 14:14:57 UTC on 24a791456cd2
build id 6749
TSAN enabled
I20250624 14:18:20.524925 31346 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 14:18:20.526700 31346 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 14:18:20.545009 31355 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 14:18:20.545554 31352 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 14:18:20.546317 31353 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 14:18:21.746523 31354 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Connection time-out
I20250624 14:18:21.746665 31346 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
I20250624 14:18:21.751068 31346 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 14:18:21.755506 31346 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 14:18:21.756984 31346 hybrid_clock.cc:648] HybridClock initialized: now 1750774701756923 us; error 90 us; skew 500 ppm
I20250624 14:18:21.757844 31346 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 14:18:21.764921 31346 webserver.cc:469] Webserver started at http://127.29.222.65:35645/ using document root <none> and password file <none>
I20250624 14:18:21.765879 31346 fs_manager.cc:362] Metadata directory not provided
I20250624 14:18:21.766074 31346 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 14:18:21.766528 31346 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 14:18:21.771133 31346 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-0/data/instance:
uuid: "87d00425996f4bfe998a10be44790dcf"
format_stamp: "Formatted at 2025-06-24 14:18:21 on dist-test-slave-6xgw"
I20250624 14:18:21.772258 31346 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-0/wal/instance:
uuid: "87d00425996f4bfe998a10be44790dcf"
format_stamp: "Formatted at 2025-06-24 14:18:21 on dist-test-slave-6xgw"
I20250624 14:18:21.779997 31346 fs_manager.cc:696] Time spent creating directory manager: real 0.007s user 0.007s sys 0.001s
I20250624 14:18:21.786027 31362 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 14:18:21.787168 31346 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.003s sys 0.001s
I20250624 14:18:21.787484 31346 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-0/data,/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-0/wal
uuid: "87d00425996f4bfe998a10be44790dcf"
format_stamp: "Formatted at 2025-06-24 14:18:21 on dist-test-slave-6xgw"
I20250624 14:18:21.787783 31346 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 14:18:21.842958 31346 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 14:18:21.844398 31346 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 14:18:21.844831 31346 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 14:18:21.847329 31346 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 14:18:21.851349 31346 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 14:18:21.851557 31346 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250624 14:18:21.851838 31346 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 14:18:21.851985 31346 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250624 14:18:21.988819 31346 rpc_server.cc:307] RPC server started. Bound to: 127.29.222.65:35103
I20250624 14:18:21.988941 31474 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.29.222.65:35103 every 8 connection(s)
I20250624 14:18:21.991451 31346 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-0/data/info.pb
I20250624 14:18:21.999739 30585 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskTXj651/build/tsan/bin/kudu as pid 31346
I20250624 14:18:22.000128 30585 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-0/wal/instance
I20250624 14:18:22.006702 30585 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskTXj651/build/tsan/bin/kudu
/tmp/dist-test-taskTXj651/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.29.222.66:0
--local_ip_for_outbound_sockets=127.29.222.66
--webserver_interface=127.29.222.66
--webserver_port=0
--tserver_master_addrs=127.29.222.126:35313
--builtin_ntp_servers=127.29.222.84:46113
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--enable_flush_memrowset=false
--enable_flush_deltamemstores=false
--tablet_copy_download_threads_nums_per_session=4
--log_segment_size_mb=1 with env {}
I20250624 14:18:22.018977 31475 heartbeater.cc:344] Connected to a master server at 127.29.222.126:35313
I20250624 14:18:22.019572 31475 heartbeater.cc:461] Registering TS with master...
I20250624 14:18:22.021040 31475 heartbeater.cc:507] Master 127.29.222.126:35313 requested a full tablet report, sending...
I20250624 14:18:22.024331 31287 ts_manager.cc:194] Registered new tserver with Master: 87d00425996f4bfe998a10be44790dcf (127.29.222.65:35103)
I20250624 14:18:22.027294 31287 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.29.222.65:60733
W20250624 14:18:22.310964 31479 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 14:18:22.311462 31479 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 14:18:22.311933 31479 flags.cc:425] Enabled unsafe flag: --enable_flush_deltamemstores=false
W20250624 14:18:22.312140 31479 flags.cc:425] Enabled unsafe flag: --enable_flush_memrowset=false
W20250624 14:18:22.312455 31479 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 14:18:22.344007 31479 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 14:18:22.345002 31479 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.29.222.66
I20250624 14:18:22.379257 31479 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.29.222.84:46113
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_segment_size_mb=1
--fs_data_dirs=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.29.222.66:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.29.222.66
--webserver_port=0
--enable_flush_deltamemstores=false
--enable_flush_memrowset=false
--tserver_master_addrs=127.29.222.126:35313
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.29.222.66
--log_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 14:14:57 UTC on 24a791456cd2
build id 6749
TSAN enabled
I20250624 14:18:22.380614 31479 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 14:18:22.382243 31479 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 14:18:22.396668 31485 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 14:18:23.031934 31475 heartbeater.cc:499] Master 127.29.222.126:35313 was elected leader, sending a full tablet report...
W20250624 14:18:22.396714 31486 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 14:18:23.594202 31487 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Connection time-out
W20250624 14:18:23.599296 31479 thread.cc:640] OpenStack (cloud detector) Time spent creating pthread: real 1.205s user 0.410s sys 0.794s
W20250624 14:18:23.599632 31479 thread.cc:606] OpenStack (cloud detector) Time spent starting thread: real 1.206s user 0.410s sys 0.794s
W20250624 14:18:23.599686 31488 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 14:18:23.599881 31479 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
I20250624 14:18:23.601079 31479 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 14:18:23.619294 31479 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 14:18:23.622900 31479 hybrid_clock.cc:648] HybridClock initialized: now 1750774703622803 us; error 72 us; skew 500 ppm
I20250624 14:18:23.623831 31479 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 14:18:23.631556 31479 webserver.cc:469] Webserver started at http://127.29.222.66:44357/ using document root <none> and password file <none>
I20250624 14:18:23.632534 31479 fs_manager.cc:362] Metadata directory not provided
I20250624 14:18:23.632766 31479 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 14:18:23.633198 31479 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 14:18:23.637732 31479 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-1/data/instance:
uuid: "c58be4fc0d7748f29f44114198693091"
format_stamp: "Formatted at 2025-06-24 14:18:23 on dist-test-slave-6xgw"
I20250624 14:18:23.638816 31479 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-1/wal/instance:
uuid: "c58be4fc0d7748f29f44114198693091"
format_stamp: "Formatted at 2025-06-24 14:18:23 on dist-test-slave-6xgw"
I20250624 14:18:23.646221 31479 fs_manager.cc:696] Time spent creating directory manager: real 0.007s user 0.005s sys 0.001s
I20250624 14:18:23.652171 31495 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 14:18:23.653328 31479 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.003s sys 0.002s
I20250624 14:18:23.653630 31479 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-1/data,/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-1/wal
uuid: "c58be4fc0d7748f29f44114198693091"
format_stamp: "Formatted at 2025-06-24 14:18:23 on dist-test-slave-6xgw"
I20250624 14:18:23.653956 31479 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 14:18:23.732178 31479 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 14:18:23.733973 31479 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 14:18:23.734571 31479 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 14:18:23.737452 31479 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 14:18:23.741458 31479 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 14:18:23.741657 31479 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250624 14:18:23.741853 31479 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 14:18:23.741983 31479 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250624 14:18:23.874495 31479 rpc_server.cc:307] RPC server started. Bound to: 127.29.222.66:34853
I20250624 14:18:23.874600 31607 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.29.222.66:34853 every 8 connection(s)
I20250624 14:18:23.876994 31479 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-1/data/info.pb
I20250624 14:18:23.881029 30585 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskTXj651/build/tsan/bin/kudu as pid 31479
I20250624 14:18:23.881532 30585 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-1/wal/instance
I20250624 14:18:23.888120 30585 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskTXj651/build/tsan/bin/kudu
/tmp/dist-test-taskTXj651/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.29.222.67:0
--local_ip_for_outbound_sockets=127.29.222.67
--webserver_interface=127.29.222.67
--webserver_port=0
--tserver_master_addrs=127.29.222.126:35313
--builtin_ntp_servers=127.29.222.84:46113
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--enable_flush_memrowset=false
--enable_flush_deltamemstores=false
--tablet_copy_download_threads_nums_per_session=4
--log_segment_size_mb=1 with env {}
I20250624 14:18:23.898007 31608 heartbeater.cc:344] Connected to a master server at 127.29.222.126:35313
I20250624 14:18:23.898444 31608 heartbeater.cc:461] Registering TS with master...
I20250624 14:18:23.899580 31608 heartbeater.cc:507] Master 127.29.222.126:35313 requested a full tablet report, sending...
I20250624 14:18:23.901877 31287 ts_manager.cc:194] Registered new tserver with Master: c58be4fc0d7748f29f44114198693091 (127.29.222.66:34853)
I20250624 14:18:23.903769 31287 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.29.222.66:37957
W20250624 14:18:24.185657 31612 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 14:18:24.186121 31612 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 14:18:24.186491 31612 flags.cc:425] Enabled unsafe flag: --enable_flush_deltamemstores=false
W20250624 14:18:24.186655 31612 flags.cc:425] Enabled unsafe flag: --enable_flush_memrowset=false
W20250624 14:18:24.186990 31612 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 14:18:24.217882 31612 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 14:18:24.218808 31612 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.29.222.67
I20250624 14:18:24.253121 31612 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.29.222.84:46113
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_segment_size_mb=1
--fs_data_dirs=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.29.222.67:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.29.222.67
--webserver_port=0
--enable_flush_deltamemstores=false
--enable_flush_memrowset=false
--tserver_master_addrs=127.29.222.126:35313
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.29.222.67
--log_dir=/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 14:14:57 UTC on 24a791456cd2
build id 6749
TSAN enabled
I20250624 14:18:24.254345 31612 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 14:18:24.256006 31612 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 14:18:24.271548 31621 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 14:18:24.907027 31608 heartbeater.cc:499] Master 127.29.222.126:35313 was elected leader, sending a full tablet report...
W20250624 14:18:24.271567 31618 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 14:18:24.271572 31619 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 14:18:24.272845 31612 server_base.cc:1048] running on GCE node
I20250624 14:18:25.428556 31612 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 14:18:25.430928 31612 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 14:18:25.432272 31612 hybrid_clock.cc:648] HybridClock initialized: now 1750774705432254 us; error 49 us; skew 500 ppm
I20250624 14:18:25.433072 31612 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 14:18:25.444612 31612 webserver.cc:469] Webserver started at http://127.29.222.67:46597/ using document root <none> and password file <none>
I20250624 14:18:25.445567 31612 fs_manager.cc:362] Metadata directory not provided
I20250624 14:18:25.445777 31612 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 14:18:25.446245 31612 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 14:18:25.450721 31612 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-2/data/instance:
uuid: "856f07fb0df0493397fac237d3228de8"
format_stamp: "Formatted at 2025-06-24 14:18:25 on dist-test-slave-6xgw"
I20250624 14:18:25.451949 31612 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-2/wal/instance:
uuid: "856f07fb0df0493397fac237d3228de8"
format_stamp: "Formatted at 2025-06-24 14:18:25 on dist-test-slave-6xgw"
I20250624 14:18:25.459012 31612 fs_manager.cc:696] Time spent creating directory manager: real 0.006s user 0.007s sys 0.000s
I20250624 14:18:25.464541 31628 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 14:18:25.465552 31612 fs_manager.cc:730] Time spent opening block manager: real 0.003s user 0.005s sys 0.000s
I20250624 14:18:25.465865 31612 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-2/data,/tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-2/wal
uuid: "856f07fb0df0493397fac237d3228de8"
format_stamp: "Formatted at 2025-06-24 14:18:25 on dist-test-slave-6xgw"
I20250624 14:18:25.466192 31612 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 14:18:25.516884 31612 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 14:18:25.518321 31612 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 14:18:25.518738 31612 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 14:18:25.521488 31612 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 14:18:25.525532 31612 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 14:18:25.525744 31612 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250624 14:18:25.525998 31612 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 14:18:25.526161 31612 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250624 14:18:25.660178 31612 rpc_server.cc:307] RPC server started. Bound to: 127.29.222.67:38441
I20250624 14:18:25.660290 31740 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.29.222.67:38441 every 8 connection(s)
I20250624 14:18:25.662709 31612 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-2/data/info.pb
I20250624 14:18:25.665900 30585 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskTXj651/build/tsan/bin/kudu as pid 31612
I20250624 14:18:25.666447 30585 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0/minicluster-data/ts-2/wal/instance
I20250624 14:18:25.685048 31741 heartbeater.cc:344] Connected to a master server at 127.29.222.126:35313
I20250624 14:18:25.685614 31741 heartbeater.cc:461] Registering TS with master...
I20250624 14:18:25.686674 31741 heartbeater.cc:507] Master 127.29.222.126:35313 requested a full tablet report, sending...
I20250624 14:18:25.688859 31287 ts_manager.cc:194] Registered new tserver with Master: 856f07fb0df0493397fac237d3228de8 (127.29.222.67:38441)
I20250624 14:18:25.690050 31287 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.29.222.67:38327
I20250624 14:18:25.702184 30585 external_mini_cluster.cc:934] 3 TS(s) registered with all masters
I20250624 14:18:25.729672 30585 test_util.cc:276] Using random seed: 6991977
I20250624 14:18:25.770125 31287 catalog_manager.cc:2232] Servicing CreateTable request from {username='slave'} at 127.0.0.1:53598:
name: "test-workload"
schema {
columns {
name: "key"
type: INT32
is_key: true
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "int_val"
type: INT32
is_key: false
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "string_val"
type: STRING
is_key: false
is_nullable: true
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
range_schema {
columns {
name: "key"
}
}
}
W20250624 14:18:25.772636 31287 catalog_manager.cc:6944] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table test-workload in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
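[editor's note] The CreateTable request logged above describes the "test-workload" table: an INT32 "key" primary key, a non-nullable INT32 "int_val", a nullable STRING "string_val", three replicas, and a single unbounded range partition on "key". The following is a minimal sketch of a client-side call that would produce an equivalent request using the public Kudu C++ client; it is illustrative only (the master address is simply the one from this log), not the test's actual code.

    // Hedged sketch (public Kudu C++ client API): create a table equivalent
    // to the CreateTable request logged above.
    #include <memory>
    #include <string>
    #include <vector>

    #include "kudu/client/client.h"
    #include "kudu/client/schema.h"
    #include "kudu/util/status.h"

    using kudu::client::KuduClient;
    using kudu::client::KuduClientBuilder;
    using kudu::client::KuduColumnSchema;
    using kudu::client::KuduSchema;
    using kudu::client::KuduSchemaBuilder;
    using kudu::client::KuduTableCreator;

    kudu::Status CreateTestWorkloadTable() {
      kudu::client::sp::shared_ptr<KuduClient> client;
      RETURN_NOT_OK(KuduClientBuilder()
                        .add_master_server_addr("127.29.222.126:35313")  // from this log
                        .Build(&client));

      // Schema matching the logged request: key INT32 PK, int_val INT32 NOT NULL,
      // string_val STRING nullable.
      KuduSchema schema;
      KuduSchemaBuilder b;
      b.AddColumn("key")->Type(KuduColumnSchema::INT32)->NotNull()->PrimaryKey();
      b.AddColumn("int_val")->Type(KuduColumnSchema::INT32)->NotNull();
      b.AddColumn("string_val")->Type(KuduColumnSchema::STRING)->Nullable();
      RETURN_NOT_OK(b.Build(&schema));

      std::unique_ptr<KuduTableCreator> creator(client->NewTableCreator());
      return creator->table_name("test-workload")
          .schema(&schema)
          .set_range_partition_columns({"key"})  // one unbounded range partition
          .num_replicas(3)
          .Create();
    }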
I20250624 14:18:25.825220 31676 tablet_service.cc:1468] Processing CreateTablet for tablet 397f6370e03a41109ca554984e7cd2ad (DEFAULT_TABLE table=test-workload [id=c67479a9c5b54a9f958340aa62db06da]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 14:18:25.827490 31676 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 397f6370e03a41109ca554984e7cd2ad. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 14:18:25.828693 31543 tablet_service.cc:1468] Processing CreateTablet for tablet 397f6370e03a41109ca554984e7cd2ad (DEFAULT_TABLE table=test-workload [id=c67479a9c5b54a9f958340aa62db06da]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 14:18:25.830448 31543 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 397f6370e03a41109ca554984e7cd2ad. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 14:18:25.841589 31410 tablet_service.cc:1468] Processing CreateTablet for tablet 397f6370e03a41109ca554984e7cd2ad (DEFAULT_TABLE table=test-workload [id=c67479a9c5b54a9f958340aa62db06da]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 14:18:25.843609 31410 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 397f6370e03a41109ca554984e7cd2ad. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 14:18:25.855551 31765 tablet_bootstrap.cc:492] T 397f6370e03a41109ca554984e7cd2ad P c58be4fc0d7748f29f44114198693091: Bootstrap starting.
I20250624 14:18:25.857777 31766 tablet_bootstrap.cc:492] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8: Bootstrap starting.
I20250624 14:18:25.863547 31765 tablet_bootstrap.cc:654] T 397f6370e03a41109ca554984e7cd2ad P c58be4fc0d7748f29f44114198693091: Neither blocks nor log segments found. Creating new log.
I20250624 14:18:25.865413 31766 tablet_bootstrap.cc:654] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8: Neither blocks nor log segments found. Creating new log.
I20250624 14:18:25.866819 31765 log.cc:826] T 397f6370e03a41109ca554984e7cd2ad P c58be4fc0d7748f29f44114198693091: Log is configured to *not* fsync() on all Append() calls
I20250624 14:18:25.867802 31766 log.cc:826] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8: Log is configured to *not* fsync() on all Append() calls
I20250624 14:18:25.870007 31767 tablet_bootstrap.cc:492] T 397f6370e03a41109ca554984e7cd2ad P 87d00425996f4bfe998a10be44790dcf: Bootstrap starting.
I20250624 14:18:25.875504 31765 tablet_bootstrap.cc:492] T 397f6370e03a41109ca554984e7cd2ad P c58be4fc0d7748f29f44114198693091: No bootstrap required, opened a new log
I20250624 14:18:25.876899 31765 ts_tablet_manager.cc:1397] T 397f6370e03a41109ca554984e7cd2ad P c58be4fc0d7748f29f44114198693091: Time spent bootstrapping tablet: real 0.022s user 0.010s sys 0.008s
I20250624 14:18:25.878477 31767 tablet_bootstrap.cc:654] T 397f6370e03a41109ca554984e7cd2ad P 87d00425996f4bfe998a10be44790dcf: Neither blocks nor log segments found. Creating new log.
I20250624 14:18:25.880566 31766 tablet_bootstrap.cc:492] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8: No bootstrap required, opened a new log
I20250624 14:18:25.881167 31766 ts_tablet_manager.cc:1397] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8: Time spent bootstrapping tablet: real 0.024s user 0.008s sys 0.013s
I20250624 14:18:25.883198 31767 log.cc:826] T 397f6370e03a41109ca554984e7cd2ad P 87d00425996f4bfe998a10be44790dcf: Log is configured to *not* fsync() on all Append() calls
I20250624 14:18:25.894762 31767 tablet_bootstrap.cc:492] T 397f6370e03a41109ca554984e7cd2ad P 87d00425996f4bfe998a10be44790dcf: No bootstrap required, opened a new log
I20250624 14:18:25.895459 31767 ts_tablet_manager.cc:1397] T 397f6370e03a41109ca554984e7cd2ad P 87d00425996f4bfe998a10be44790dcf: Time spent bootstrapping tablet: real 0.026s user 0.009s sys 0.012s
I20250624 14:18:25.905151 31765 raft_consensus.cc:357] T 397f6370e03a41109ca554984e7cd2ad P c58be4fc0d7748f29f44114198693091 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "856f07fb0df0493397fac237d3228de8" member_type: VOTER last_known_addr { host: "127.29.222.67" port: 38441 } } peers { permanent_uuid: "c58be4fc0d7748f29f44114198693091" member_type: VOTER last_known_addr { host: "127.29.222.66" port: 34853 } } peers { permanent_uuid: "87d00425996f4bfe998a10be44790dcf" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 35103 } }
I20250624 14:18:25.906131 31765 raft_consensus.cc:383] T 397f6370e03a41109ca554984e7cd2ad P c58be4fc0d7748f29f44114198693091 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 14:18:25.906430 31765 raft_consensus.cc:738] T 397f6370e03a41109ca554984e7cd2ad P c58be4fc0d7748f29f44114198693091 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: c58be4fc0d7748f29f44114198693091, State: Initialized, Role: FOLLOWER
I20250624 14:18:25.907382 31765 consensus_queue.cc:260] T 397f6370e03a41109ca554984e7cd2ad P c58be4fc0d7748f29f44114198693091 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "856f07fb0df0493397fac237d3228de8" member_type: VOTER last_known_addr { host: "127.29.222.67" port: 38441 } } peers { permanent_uuid: "c58be4fc0d7748f29f44114198693091" member_type: VOTER last_known_addr { host: "127.29.222.66" port: 34853 } } peers { permanent_uuid: "87d00425996f4bfe998a10be44790dcf" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 35103 } }
I20250624 14:18:25.910420 31766 raft_consensus.cc:357] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "856f07fb0df0493397fac237d3228de8" member_type: VOTER last_known_addr { host: "127.29.222.67" port: 38441 } } peers { permanent_uuid: "c58be4fc0d7748f29f44114198693091" member_type: VOTER last_known_addr { host: "127.29.222.66" port: 34853 } } peers { permanent_uuid: "87d00425996f4bfe998a10be44790dcf" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 35103 } }
I20250624 14:18:25.911355 31766 raft_consensus.cc:383] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 14:18:25.911660 31766 raft_consensus.cc:738] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 856f07fb0df0493397fac237d3228de8, State: Initialized, Role: FOLLOWER
I20250624 14:18:25.912546 31766 consensus_queue.cc:260] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "856f07fb0df0493397fac237d3228de8" member_type: VOTER last_known_addr { host: "127.29.222.67" port: 38441 } } peers { permanent_uuid: "c58be4fc0d7748f29f44114198693091" member_type: VOTER last_known_addr { host: "127.29.222.66" port: 34853 } } peers { permanent_uuid: "87d00425996f4bfe998a10be44790dcf" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 35103 } }
I20250624 14:18:25.922816 31765 ts_tablet_manager.cc:1428] T 397f6370e03a41109ca554984e7cd2ad P c58be4fc0d7748f29f44114198693091: Time spent starting tablet: real 0.046s user 0.030s sys 0.010s
I20250624 14:18:25.931797 31741 heartbeater.cc:499] Master 127.29.222.126:35313 was elected leader, sending a full tablet report...
I20250624 14:18:25.933720 31766 ts_tablet_manager.cc:1428] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8: Time spent starting tablet: real 0.052s user 0.033s sys 0.012s
I20250624 14:18:25.934705 31767 raft_consensus.cc:357] T 397f6370e03a41109ca554984e7cd2ad P 87d00425996f4bfe998a10be44790dcf [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "856f07fb0df0493397fac237d3228de8" member_type: VOTER last_known_addr { host: "127.29.222.67" port: 38441 } } peers { permanent_uuid: "c58be4fc0d7748f29f44114198693091" member_type: VOTER last_known_addr { host: "127.29.222.66" port: 34853 } } peers { permanent_uuid: "87d00425996f4bfe998a10be44790dcf" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 35103 } }
I20250624 14:18:25.935806 31767 raft_consensus.cc:383] T 397f6370e03a41109ca554984e7cd2ad P 87d00425996f4bfe998a10be44790dcf [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 14:18:25.936162 31767 raft_consensus.cc:738] T 397f6370e03a41109ca554984e7cd2ad P 87d00425996f4bfe998a10be44790dcf [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 87d00425996f4bfe998a10be44790dcf, State: Initialized, Role: FOLLOWER
I20250624 14:18:25.936790 31772 raft_consensus.cc:491] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250624 14:18:25.937112 31767 consensus_queue.cc:260] T 397f6370e03a41109ca554984e7cd2ad P 87d00425996f4bfe998a10be44790dcf [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "856f07fb0df0493397fac237d3228de8" member_type: VOTER last_known_addr { host: "127.29.222.67" port: 38441 } } peers { permanent_uuid: "c58be4fc0d7748f29f44114198693091" member_type: VOTER last_known_addr { host: "127.29.222.66" port: 34853 } } peers { permanent_uuid: "87d00425996f4bfe998a10be44790dcf" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 35103 } }
I20250624 14:18:25.937319 31772 raft_consensus.cc:513] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "856f07fb0df0493397fac237d3228de8" member_type: VOTER last_known_addr { host: "127.29.222.67" port: 38441 } } peers { permanent_uuid: "c58be4fc0d7748f29f44114198693091" member_type: VOTER last_known_addr { host: "127.29.222.66" port: 34853 } } peers { permanent_uuid: "87d00425996f4bfe998a10be44790dcf" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 35103 } }
I20250624 14:18:25.945346 31771 raft_consensus.cc:491] T 397f6370e03a41109ca554984e7cd2ad P c58be4fc0d7748f29f44114198693091 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250624 14:18:25.946283 31767 ts_tablet_manager.cc:1428] T 397f6370e03a41109ca554984e7cd2ad P 87d00425996f4bfe998a10be44790dcf: Time spent starting tablet: real 0.047s user 0.036s sys 0.000s
I20250624 14:18:25.945916 31771 raft_consensus.cc:513] T 397f6370e03a41109ca554984e7cd2ad P c58be4fc0d7748f29f44114198693091 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "856f07fb0df0493397fac237d3228de8" member_type: VOTER last_known_addr { host: "127.29.222.67" port: 38441 } } peers { permanent_uuid: "c58be4fc0d7748f29f44114198693091" member_type: VOTER last_known_addr { host: "127.29.222.66" port: 34853 } } peers { permanent_uuid: "87d00425996f4bfe998a10be44790dcf" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 35103 } }
I20250624 14:18:25.952205 31771 leader_election.cc:290] T 397f6370e03a41109ca554984e7cd2ad P c58be4fc0d7748f29f44114198693091 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 856f07fb0df0493397fac237d3228de8 (127.29.222.67:38441), 87d00425996f4bfe998a10be44790dcf (127.29.222.65:35103)
I20250624 14:18:25.966055 31772 leader_election.cc:290] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers c58be4fc0d7748f29f44114198693091 (127.29.222.66:34853), 87d00425996f4bfe998a10be44790dcf (127.29.222.65:35103)
I20250624 14:18:25.974931 31563 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "397f6370e03a41109ca554984e7cd2ad" candidate_uuid: "856f07fb0df0493397fac237d3228de8" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "c58be4fc0d7748f29f44114198693091" is_pre_election: true
I20250624 14:18:25.975813 31563 raft_consensus.cc:2466] T 397f6370e03a41109ca554984e7cd2ad P c58be4fc0d7748f29f44114198693091 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 856f07fb0df0493397fac237d3228de8 in term 0.
I20250624 14:18:25.977604 31632 leader_election.cc:304] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 856f07fb0df0493397fac237d3228de8, c58be4fc0d7748f29f44114198693091; no voters:
I20250624 14:18:25.979004 31772 raft_consensus.cc:2802] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20250624 14:18:25.979564 31772 raft_consensus.cc:491] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20250624 14:18:25.980010 31772 raft_consensus.cc:3058] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8 [term 0 FOLLOWER]: Advancing to term 1
I20250624 14:18:25.987746 31430 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "397f6370e03a41109ca554984e7cd2ad" candidate_uuid: "856f07fb0df0493397fac237d3228de8" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "87d00425996f4bfe998a10be44790dcf" is_pre_election: true
I20250624 14:18:25.988606 31430 raft_consensus.cc:2466] T 397f6370e03a41109ca554984e7cd2ad P 87d00425996f4bfe998a10be44790dcf [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 856f07fb0df0493397fac237d3228de8 in term 0.
I20250624 14:18:25.989843 31696 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "397f6370e03a41109ca554984e7cd2ad" candidate_uuid: "c58be4fc0d7748f29f44114198693091" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "856f07fb0df0493397fac237d3228de8" is_pre_election: true
I20250624 14:18:25.991189 31772 raft_consensus.cc:513] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "856f07fb0df0493397fac237d3228de8" member_type: VOTER last_known_addr { host: "127.29.222.67" port: 38441 } } peers { permanent_uuid: "c58be4fc0d7748f29f44114198693091" member_type: VOTER last_known_addr { host: "127.29.222.66" port: 34853 } } peers { permanent_uuid: "87d00425996f4bfe998a10be44790dcf" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 35103 } }
I20250624 14:18:25.992486 31696 raft_consensus.cc:2391] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8 [term 1 FOLLOWER]: Leader pre-election vote request: Denying vote to candidate c58be4fc0d7748f29f44114198693091 in current term 1: Already voted for candidate 856f07fb0df0493397fac237d3228de8 in this term.
I20250624 14:18:25.996634 31430 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "397f6370e03a41109ca554984e7cd2ad" candidate_uuid: "c58be4fc0d7748f29f44114198693091" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "87d00425996f4bfe998a10be44790dcf" is_pre_election: true
I20250624 14:18:25.997177 31430 raft_consensus.cc:2466] T 397f6370e03a41109ca554984e7cd2ad P 87d00425996f4bfe998a10be44790dcf [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate c58be4fc0d7748f29f44114198693091 in term 0.
I20250624 14:18:25.998411 31496 leader_election.cc:304] T 397f6370e03a41109ca554984e7cd2ad P c58be4fc0d7748f29f44114198693091 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 87d00425996f4bfe998a10be44790dcf, c58be4fc0d7748f29f44114198693091; no voters:
I20250624 14:18:25.999661 31771 raft_consensus.cc:2802] T 397f6370e03a41109ca554984e7cd2ad P c58be4fc0d7748f29f44114198693091 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20250624 14:18:26.000006 31771 raft_consensus.cc:491] T 397f6370e03a41109ca554984e7cd2ad P c58be4fc0d7748f29f44114198693091 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20250624 14:18:26.000331 31771 raft_consensus.cc:3058] T 397f6370e03a41109ca554984e7cd2ad P c58be4fc0d7748f29f44114198693091 [term 0 FOLLOWER]: Advancing to term 1
W20250624 14:18:26.000705 31476 tablet.cc:2378] T 397f6370e03a41109ca554984e7cd2ad P 87d00425996f4bfe998a10be44790dcf: Can't schedule compaction. Clean time has not been advanced past its initial value.
W20250624 14:18:26.001231 31476 tablet_replica_mm_ops.cc:240] Deltamemstore flush is disabled (check --enable_flush_deltamemstores)
W20250624 14:18:26.001473 31476 tablet_replica_mm_ops.cc:163] Memrowset flush is disabled (check --enable_flush_memrowset)
I20250624 14:18:26.001468 31563 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "397f6370e03a41109ca554984e7cd2ad" candidate_uuid: "856f07fb0df0493397fac237d3228de8" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "c58be4fc0d7748f29f44114198693091"
I20250624 14:18:26.002547 31772 leader_election.cc:290] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8 [CANDIDATE]: Term 1 election: Requested vote from peers c58be4fc0d7748f29f44114198693091 (127.29.222.66:34853), 87d00425996f4bfe998a10be44790dcf (127.29.222.65:35103)
I20250624 14:18:26.004235 31430 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "397f6370e03a41109ca554984e7cd2ad" candidate_uuid: "856f07fb0df0493397fac237d3228de8" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "87d00425996f4bfe998a10be44790dcf"
I20250624 14:18:26.004770 31430 raft_consensus.cc:3058] T 397f6370e03a41109ca554984e7cd2ad P 87d00425996f4bfe998a10be44790dcf [term 0 FOLLOWER]: Advancing to term 1
I20250624 14:18:26.007560 31771 raft_consensus.cc:513] T 397f6370e03a41109ca554984e7cd2ad P c58be4fc0d7748f29f44114198693091 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "856f07fb0df0493397fac237d3228de8" member_type: VOTER last_known_addr { host: "127.29.222.67" port: 38441 } } peers { permanent_uuid: "c58be4fc0d7748f29f44114198693091" member_type: VOTER last_known_addr { host: "127.29.222.66" port: 34853 } } peers { permanent_uuid: "87d00425996f4bfe998a10be44790dcf" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 35103 } }
I20250624 14:18:26.008672 31563 raft_consensus.cc:2391] T 397f6370e03a41109ca554984e7cd2ad P c58be4fc0d7748f29f44114198693091 [term 1 FOLLOWER]: Leader election vote request: Denying vote to candidate 856f07fb0df0493397fac237d3228de8 in current term 1: Already voted for candidate c58be4fc0d7748f29f44114198693091 in this term.
I20250624 14:18:26.012948 31430 raft_consensus.cc:2466] T 397f6370e03a41109ca554984e7cd2ad P 87d00425996f4bfe998a10be44790dcf [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 856f07fb0df0493397fac237d3228de8 in term 1.
I20250624 14:18:26.014632 31629 leader_election.cc:304] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 856f07fb0df0493397fac237d3228de8, 87d00425996f4bfe998a10be44790dcf; no voters: c58be4fc0d7748f29f44114198693091
I20250624 14:18:26.015893 31771 leader_election.cc:290] T 397f6370e03a41109ca554984e7cd2ad P c58be4fc0d7748f29f44114198693091 [CANDIDATE]: Term 1 election: Requested vote from peers 856f07fb0df0493397fac237d3228de8 (127.29.222.67:38441), 87d00425996f4bfe998a10be44790dcf (127.29.222.65:35103)
I20250624 14:18:26.016906 31772 raft_consensus.cc:2802] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8 [term 1 FOLLOWER]: Leader election won for term 1
I20250624 14:18:26.017827 31696 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "397f6370e03a41109ca554984e7cd2ad" candidate_uuid: "c58be4fc0d7748f29f44114198693091" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "856f07fb0df0493397fac237d3228de8"
I20250624 14:18:26.018676 31772 raft_consensus.cc:695] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8 [term 1 LEADER]: Becoming Leader. State: Replica: 856f07fb0df0493397fac237d3228de8, State: Running, Role: LEADER
I20250624 14:18:26.019886 31430 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "397f6370e03a41109ca554984e7cd2ad" candidate_uuid: "c58be4fc0d7748f29f44114198693091" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "87d00425996f4bfe998a10be44790dcf"
I20250624 14:18:26.019732 31772 consensus_queue.cc:237] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "856f07fb0df0493397fac237d3228de8" member_type: VOTER last_known_addr { host: "127.29.222.67" port: 38441 } } peers { permanent_uuid: "c58be4fc0d7748f29f44114198693091" member_type: VOTER last_known_addr { host: "127.29.222.66" port: 34853 } } peers { permanent_uuid: "87d00425996f4bfe998a10be44790dcf" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 35103 } }
I20250624 14:18:26.020944 31430 raft_consensus.cc:2391] T 397f6370e03a41109ca554984e7cd2ad P 87d00425996f4bfe998a10be44790dcf [term 1 FOLLOWER]: Leader election vote request: Denying vote to candidate c58be4fc0d7748f29f44114198693091 in current term 1: Already voted for candidate 856f07fb0df0493397fac237d3228de8 in this term.
I20250624 14:18:26.026150 31498 leader_election.cc:304] T 397f6370e03a41109ca554984e7cd2ad P c58be4fc0d7748f29f44114198693091 [CANDIDATE]: Term 1 election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: c58be4fc0d7748f29f44114198693091; no voters: 856f07fb0df0493397fac237d3228de8, 87d00425996f4bfe998a10be44790dcf
I20250624 14:18:26.027266 31771 raft_consensus.cc:2747] T 397f6370e03a41109ca554984e7cd2ad P c58be4fc0d7748f29f44114198693091 [term 1 FOLLOWER]: Leader election lost for term 1. Reason: could not achieve majority
I20250624 14:18:26.030378 31287 catalog_manager.cc:5582] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8 reported cstate change: term changed from 0 to 1, leader changed from <none> to 856f07fb0df0493397fac237d3228de8 (127.29.222.67). New cstate: current_term: 1 leader_uuid: "856f07fb0df0493397fac237d3228de8" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "856f07fb0df0493397fac237d3228de8" member_type: VOTER last_known_addr { host: "127.29.222.67" port: 38441 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "c58be4fc0d7748f29f44114198693091" member_type: VOTER last_known_addr { host: "127.29.222.66" port: 34853 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "87d00425996f4bfe998a10be44790dcf" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 35103 } health_report { overall_health: UNKNOWN } } }
I20250624 14:18:26.096041 31430 raft_consensus.cc:1273] T 397f6370e03a41109ca554984e7cd2ad P 87d00425996f4bfe998a10be44790dcf [term 1 FOLLOWER]: Refusing update from remote peer 856f07fb0df0493397fac237d3228de8: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20250624 14:18:26.096516 31563 raft_consensus.cc:1273] T 397f6370e03a41109ca554984e7cd2ad P c58be4fc0d7748f29f44114198693091 [term 1 FOLLOWER]: Refusing update from remote peer 856f07fb0df0493397fac237d3228de8: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20250624 14:18:26.097373 31777 consensus_queue.cc:1035] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8 [LEADER]: Connected to new peer: Peer: permanent_uuid: "87d00425996f4bfe998a10be44790dcf" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 35103 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.001s
I20250624 14:18:26.098095 31772 consensus_queue.cc:1035] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8 [LEADER]: Connected to new peer: Peer: permanent_uuid: "c58be4fc0d7748f29f44114198693091" member_type: VOTER last_known_addr { host: "127.29.222.66" port: 34853 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
W20250624 14:18:26.133929 31609 tablet_replica_mm_ops.cc:240] Deltamemstore flush is disabled (check --enable_flush_deltamemstores)
W20250624 14:18:26.134313 31609 tablet_replica_mm_ops.cc:163] Memrowset flush is disabled (check --enable_flush_memrowset)
W20250624 14:18:26.167771 31742 tablet_replica_mm_ops.cc:240] Deltamemstore flush is disabled (check --enable_flush_deltamemstores)
W20250624 14:18:26.168056 31742 tablet_replica_mm_ops.cc:163] Memrowset flush is disabled (check --enable_flush_memrowset)
I20250624 14:18:26.177253 31430 tablet_service.cc:1940] Received Run Leader Election RPC: tablet_id: "397f6370e03a41109ca554984e7cd2ad"
dest_uuid: "87d00425996f4bfe998a10be44790dcf"
from {username='slave'} at 127.0.0.1:43426
I20250624 14:18:26.177754 31430 raft_consensus.cc:491] T 397f6370e03a41109ca554984e7cd2ad P 87d00425996f4bfe998a10be44790dcf [term 1 FOLLOWER]: Starting forced leader election (received explicit request)
I20250624 14:18:26.178064 31430 raft_consensus.cc:3058] T 397f6370e03a41109ca554984e7cd2ad P 87d00425996f4bfe998a10be44790dcf [term 1 FOLLOWER]: Advancing to term 2
I20250624 14:18:26.182152 31430 raft_consensus.cc:513] T 397f6370e03a41109ca554984e7cd2ad P 87d00425996f4bfe998a10be44790dcf [term 2 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "856f07fb0df0493397fac237d3228de8" member_type: VOTER last_known_addr { host: "127.29.222.67" port: 38441 } } peers { permanent_uuid: "c58be4fc0d7748f29f44114198693091" member_type: VOTER last_known_addr { host: "127.29.222.66" port: 34853 } } peers { permanent_uuid: "87d00425996f4bfe998a10be44790dcf" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 35103 } }
I20250624 14:18:26.184286 31430 leader_election.cc:290] T 397f6370e03a41109ca554984e7cd2ad P 87d00425996f4bfe998a10be44790dcf [CANDIDATE]: Term 2 election: Requested vote from peers 856f07fb0df0493397fac237d3228de8 (127.29.222.67:38441), c58be4fc0d7748f29f44114198693091 (127.29.222.66:34853)
I20250624 14:18:26.205525 31696 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "397f6370e03a41109ca554984e7cd2ad" candidate_uuid: "87d00425996f4bfe998a10be44790dcf" candidate_term: 2 candidate_status { last_received { term: 1 index: 2 } } ignore_live_leader: true dest_uuid: "856f07fb0df0493397fac237d3228de8"
I20250624 14:18:26.206156 31696 raft_consensus.cc:3053] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8 [term 1 LEADER]: Stepping down as leader of term 1
I20250624 14:18:26.206467 31696 raft_consensus.cc:738] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8 [term 1 LEADER]: Becoming Follower/Learner. State: Replica: 856f07fb0df0493397fac237d3228de8, State: Running, Role: LEADER
I20250624 14:18:26.206621 31563 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "397f6370e03a41109ca554984e7cd2ad" candidate_uuid: "87d00425996f4bfe998a10be44790dcf" candidate_term: 2 candidate_status { last_received { term: 1 index: 2 } } ignore_live_leader: true dest_uuid: "c58be4fc0d7748f29f44114198693091"
I20250624 14:18:26.207163 31563 raft_consensus.cc:3058] T 397f6370e03a41109ca554984e7cd2ad P c58be4fc0d7748f29f44114198693091 [term 1 FOLLOWER]: Advancing to term 2
I20250624 14:18:26.207265 31696 consensus_queue.cc:260] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 2, Committed index: 2, Last appended: 1.2, Last appended by leader: 2, Current term: 1, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "856f07fb0df0493397fac237d3228de8" member_type: VOTER last_known_addr { host: "127.29.222.67" port: 38441 } } peers { permanent_uuid: "c58be4fc0d7748f29f44114198693091" member_type: VOTER last_known_addr { host: "127.29.222.66" port: 34853 } } peers { permanent_uuid: "87d00425996f4bfe998a10be44790dcf" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 35103 } }
I20250624 14:18:26.208588 31696 raft_consensus.cc:3058] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8 [term 1 FOLLOWER]: Advancing to term 2
I20250624 14:18:26.214285 31563 raft_consensus.cc:2466] T 397f6370e03a41109ca554984e7cd2ad P c58be4fc0d7748f29f44114198693091 [term 2 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 87d00425996f4bfe998a10be44790dcf in term 2.
I20250624 14:18:26.215764 31696 raft_consensus.cc:2466] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8 [term 2 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 87d00425996f4bfe998a10be44790dcf in term 2.
I20250624 14:18:26.215750 31366 leader_election.cc:304] T 397f6370e03a41109ca554984e7cd2ad P 87d00425996f4bfe998a10be44790dcf [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 87d00425996f4bfe998a10be44790dcf, c58be4fc0d7748f29f44114198693091; no voters:
I20250624 14:18:26.216791 31773 raft_consensus.cc:2802] T 397f6370e03a41109ca554984e7cd2ad P 87d00425996f4bfe998a10be44790dcf [term 2 FOLLOWER]: Leader election won for term 2
I20250624 14:18:26.219026 31773 raft_consensus.cc:695] T 397f6370e03a41109ca554984e7cd2ad P 87d00425996f4bfe998a10be44790dcf [term 2 LEADER]: Becoming Leader. State: Replica: 87d00425996f4bfe998a10be44790dcf, State: Running, Role: LEADER
I20250624 14:18:26.219995 31773 consensus_queue.cc:237] T 397f6370e03a41109ca554984e7cd2ad P 87d00425996f4bfe998a10be44790dcf [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 2, Committed index: 2, Last appended: 1.2, Last appended by leader: 2, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "856f07fb0df0493397fac237d3228de8" member_type: VOTER last_known_addr { host: "127.29.222.67" port: 38441 } } peers { permanent_uuid: "c58be4fc0d7748f29f44114198693091" member_type: VOTER last_known_addr { host: "127.29.222.66" port: 34853 } } peers { permanent_uuid: "87d00425996f4bfe998a10be44790dcf" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 35103 } }
I20250624 14:18:26.231242 31286 catalog_manager.cc:5582] T 397f6370e03a41109ca554984e7cd2ad P 87d00425996f4bfe998a10be44790dcf reported cstate change: term changed from 1 to 2, leader changed from 856f07fb0df0493397fac237d3228de8 (127.29.222.67) to 87d00425996f4bfe998a10be44790dcf (127.29.222.65). New cstate: current_term: 2 leader_uuid: "87d00425996f4bfe998a10be44790dcf" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "856f07fb0df0493397fac237d3228de8" member_type: VOTER last_known_addr { host: "127.29.222.67" port: 38441 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "c58be4fc0d7748f29f44114198693091" member_type: VOTER last_known_addr { host: "127.29.222.66" port: 34853 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "87d00425996f4bfe998a10be44790dcf" member_type: VOTER last_known_addr { host: "127.29.222.65" port: 35103 } health_report { overall_health: HEALTHY } } }
W20250624 14:18:26.264752 31656 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:42560: Illegal state: replica 856f07fb0df0493397fac237d3228de8 is not leader of this config: current role FOLLOWER
W20250624 14:18:26.276332 31656 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:42560: Illegal state: replica 856f07fb0df0493397fac237d3228de8 is not leader of this config: current role FOLLOWER
W20250624 14:18:26.280725 31656 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:42560: Illegal state: replica 856f07fb0df0493397fac237d3228de8 is not leader of this config: current role FOLLOWER
W20250624 14:18:26.281358 31655 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:42560: Illegal state: replica 856f07fb0df0493397fac237d3228de8 is not leader of this config: current role FOLLOWER
W20250624 14:18:26.294320 31521 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51392: Illegal state: replica c58be4fc0d7748f29f44114198693091 is not leader of this config: current role FOLLOWER
W20250624 14:18:26.294677 31523 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51392: Illegal state: replica c58be4fc0d7748f29f44114198693091 is not leader of this config: current role FOLLOWER
W20250624 14:18:26.294060 31522 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51392: Illegal state: replica c58be4fc0d7748f29f44114198693091 is not leader of this config: current role FOLLOWER
W20250624 14:18:26.298509 31520 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51392: Illegal state: replica c58be4fc0d7748f29f44114198693091 is not leader of this config: current role FOLLOWER
W20250624 14:18:26.298923 31518 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51392: Illegal state: replica c58be4fc0d7748f29f44114198693091 is not leader of this config: current role FOLLOWER
W20250624 14:18:26.300123 31519 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51392: Illegal state: replica c58be4fc0d7748f29f44114198693091 is not leader of this config: current role FOLLOWER
W20250624 14:18:26.303090 31517 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51392: Illegal state: replica c58be4fc0d7748f29f44114198693091 is not leader of this config: current role FOLLOWER
W20250624 14:18:26.303580 31521 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:51392: Illegal state: replica c58be4fc0d7748f29f44114198693091 is not leader of this config: current role FOLLOWER
W20250624 14:18:26.304508 31656 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:42560: Illegal state: replica 856f07fb0df0493397fac237d3228de8 is not leader of this config: current role FOLLOWER
W20250624 14:18:26.309140 31656 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:42560: Illegal state: replica 856f07fb0df0493397fac237d3228de8 is not leader of this config: current role FOLLOWER
I20250624 14:18:26.332566 31696 raft_consensus.cc:1273] T 397f6370e03a41109ca554984e7cd2ad P 856f07fb0df0493397fac237d3228de8 [term 2 FOLLOWER]: Refusing update from remote peer 87d00425996f4bfe998a10be44790dcf: Log matching property violated. Preceding OpId in replica: term: 1 index: 2. Preceding OpId from leader: term: 2 index: 4. (index mismatch)
I20250624 14:18:26.332840 31563 raft_consensus.cc:1273] T 397f6370e03a41109ca554984e7cd2ad P c58be4fc0d7748f29f44114198693091 [term 2 FOLLOWER]: Refusing update from remote peer 87d00425996f4bfe998a10be44790dcf: Log matching property violated. Preceding OpId in replica: term: 1 index: 2. Preceding OpId from leader: term: 2 index: 4. (index mismatch)
I20250624 14:18:26.334074 31773 consensus_queue.cc:1035] T 397f6370e03a41109ca554984e7cd2ad P 87d00425996f4bfe998a10be44790dcf [LEADER]: Connected to new peer: Peer: permanent_uuid: "856f07fb0df0493397fac237d3228de8" member_type: VOTER last_known_addr { host: "127.29.222.67" port: 38441 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 3, Last known committed idx: 2, Time since last communication: 0.000s
I20250624 14:18:26.334962 31800 consensus_queue.cc:1035] T 397f6370e03a41109ca554984e7cd2ad P 87d00425996f4bfe998a10be44790dcf [LEADER]: Connected to new peer: Peer: permanent_uuid: "c58be4fc0d7748f29f44114198693091" member_type: VOTER last_known_addr { host: "127.29.222.66" port: 34853 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 3, Last known committed idx: 2, Time since last communication: 0.001s
I20250624 14:18:26.347401 31780 mvcc.cc:204] Tried to move back new op lower bound from 7171173197122523136 to 7171173196688334848. Current Snapshot: MvccSnapshot[applied={T|T < 7171173197122523136}]
I20250624 14:18:26.362623 31781 mvcc.cc:204] Tried to move back new op lower bound from 7171173197122523136 to 7171173196688334848. Current Snapshot: MvccSnapshot[applied={T|T < 7171173197122523136}]
I20250624 14:18:26.541425 31782 mvcc.cc:204] Tried to move back new op lower bound from 7171173197870882816 to 7171173196688334848. Current Snapshot: MvccSnapshot[applied={T|T < 7171173197870882816 or (T in {7171173197870882816})}]
W20250624 14:18:45.658756 31318 debug-util.cc:398] Leaking SignalData structure 0x7b080008bb20 after lost signal to thread 31255
W20250624 14:18:45.659533 31318 debug-util.cc:398] Leaking SignalData structure 0x7b08000a9280 after lost signal to thread 31321
W20250624 14:19:09.079731 31737 debug-util.cc:398] Leaking SignalData structure 0x7b0800041080 after lost signal to thread 31613
W20250624 14:19:09.080880 31737 debug-util.cc:398] Leaking SignalData structure 0x7b08000cab60 after lost signal to thread 31740
W20250624 14:19:13.781281 31318 debug-util.cc:398] Leaking SignalData structure 0x7b08000a3080 after lost signal to thread 31255
W20250624 14:19:13.781981 31318 debug-util.cc:398] Leaking SignalData structure 0x7b080009d3a0 after lost signal to thread 31321
/home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/integration-tests/tablet_copy-itest.cc:2151: Failure
Failed
Bad status: Timed out: Timed out waiting for number of WAL segments on tablet 397f6370e03a41109ca554984e7cd2ad on TS 0 to be 6. Found 5
I20250624 14:19:26.373417 30585 external_mini_cluster-itest-base.cc:80] Found fatal failure
I20250624 14:19:26.373883 30585 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 0 with UUID 87d00425996f4bfe998a10be44790dcf and pid 31346
************************ BEGIN STACKS **************************
[New LWP 31347]
[New LWP 31348]
[New LWP 31349]
[New LWP 31350]
[New LWP 31351]
[New LWP 31358]
[New LWP 31359]
[New LWP 31360]
[New LWP 31363]
[New LWP 31364]
[New LWP 31365]
[New LWP 31366]
[New LWP 31367]
[New LWP 31368]
[New LWP 31369]
[New LWP 31370]
[New LWP 31371]
[New LWP 31372]
[New LWP 31373]
[New LWP 31374]
[New LWP 31375]
[New LWP 31376]
[New LWP 31377]
[New LWP 31378]
[New LWP 31379]
[New LWP 31380]
[New LWP 31381]
[New LWP 31382]
[New LWP 31383]
[New LWP 31384]
[New LWP 31385]
[New LWP 31386]
[New LWP 31387]
[New LWP 31388]
[New LWP 31389]
[New LWP 31390]
[New LWP 31391]
[New LWP 31392]
[New LWP 31393]
[New LWP 31394]
[New LWP 31395]
[New LWP 31396]
[New LWP 31397]
[New LWP 31398]
[New LWP 31399]
[New LWP 31400]
[New LWP 31401]
[New LWP 31402]
[New LWP 31403]
[New LWP 31404]
[New LWP 31405]
[New LWP 31406]
[New LWP 31407]
[New LWP 31408]
[New LWP 31409]
[New LWP 31410]
[New LWP 31411]
[New LWP 31412]
[New LWP 31413]
[New LWP 31414]
[New LWP 31415]
[New LWP 31416]
[New LWP 31417]
[New LWP 31418]
[New LWP 31419]
[New LWP 31420]
[New LWP 31421]
[New LWP 31422]
[New LWP 31423]
[New LWP 31424]
[New LWP 31425]
[New LWP 31426]
[New LWP 31427]
[New LWP 31428]
[New LWP 31429]
[New LWP 31430]
[New LWP 31431]
[New LWP 31432]
[New LWP 31433]
[New LWP 31434]
[New LWP 31435]
[New LWP 31436]
[New LWP 31437]
[New LWP 31438]
[New LWP 31439]
[New LWP 31440]
[New LWP 31441]
[New LWP 31442]
[New LWP 31443]
[New LWP 31444]
[New LWP 31445]
[New LWP 31446]
[New LWP 31447]
[New LWP 31448]
[New LWP 31449]
[New LWP 31450]
[New LWP 31451]
[New LWP 31452]
[New LWP 31453]
[New LWP 31454]
[New LWP 31455]
[New LWP 31456]
[New LWP 31457]
[New LWP 31458]
[New LWP 31459]
[New LWP 31460]
[New LWP 31461]
[New LWP 31462]
[New LWP 31463]
[New LWP 31464]
[New LWP 31465]
[New LWP 31466]
[New LWP 31467]
[New LWP 31468]
[New LWP 31469]
[New LWP 31470]
[New LWP 31471]
[New LWP 31472]
[New LWP 31473]
[New LWP 31474]
[New LWP 31475]
[New LWP 31476]
[New LWP 31783]
[New LWP 31976]
Cannot access memory at address 0x18
Cannot access memory at address 0x10
Cannot access memory at address 0x18
Cannot access memory at address 0x18
Cannot access memory at address 0x10
0x00007fc2e3b38d50 in ?? ()
Id Target Id Frame
* 1 LWP 31346 "kudu" 0x00007fc2e3b38d50 in ?? ()
2 LWP 31347 "kudu" 0x00007fc2deefd7a0 in ?? ()
3 LWP 31348 "kudu" 0x00007fc2e3b34fb9 in ?? ()
4 LWP 31349 "kudu" 0x00007fc2e3b34fb9 in ?? ()
5 LWP 31350 "kudu" 0x00007fc2e3b34fb9 in ?? ()
6 LWP 31351 "kernel-watcher-" 0x00007fc2e3b34fb9 in ?? ()
7 LWP 31358 "ntp client-3135" 0x00007fc2e3b389e2 in ?? ()
8 LWP 31359 "file cache-evic" 0x00007fc2e3b34fb9 in ?? ()
9 LWP 31360 "sq_acceptor" 0x00007fc2def2dcb9 in ?? ()
10 LWP 31363 "rpc reactor-313" 0x00007fc2def3aa47 in ?? ()
11 LWP 31364 "rpc reactor-313" 0x00007fc2def3aa47 in ?? ()
12 LWP 31365 "rpc reactor-313" 0x00007fc2def3aa47 in ?? ()
13 LWP 31366 "rpc reactor-313" 0x00007fc2def3aa47 in ?? ()
14 LWP 31367 "MaintenanceMgr " 0x00007fc2e3b34ad3 in ?? ()
15 LWP 31368 "txn-status-mana" 0x00007fc2e3b34fb9 in ?? ()
16 LWP 31369 "collect_and_rem" 0x00007fc2e3b34fb9 in ?? ()
17 LWP 31370 "tc-session-exp-" 0x00007fc2e3b34fb9 in ?? ()
18 LWP 31371 "rpc worker-3137" 0x00007fc2e3b34ad3 in ?? ()
19 LWP 31372 "rpc worker-3137" 0x00007fc2e3b34ad3 in ?? ()
20 LWP 31373 "rpc worker-3137" 0x00007fc2e3b34ad3 in ?? ()
21 LWP 31374 "rpc worker-3137" 0x00007fc2e3b34ad3 in ?? ()
22 LWP 31375 "rpc worker-3137" 0x00007fc2e3b34ad3 in ?? ()
23 LWP 31376 "rpc worker-3137" 0x00007fc2e3b34ad3 in ?? ()
24 LWP 31377 "rpc worker-3137" 0x00007fc2e3b34ad3 in ?? ()
25 LWP 31378 "rpc worker-3137" 0x00007fc2e3b34ad3 in ?? ()
26 LWP 31379 "rpc worker-3137" 0x00007fc2e3b34ad3 in ?? ()
27 LWP 31380 "rpc worker-3138" 0x00007fc2e3b34ad3 in ?? ()
28 LWP 31381 "rpc worker-3138" 0x00007fc2e3b34ad3 in ?? ()
29 LWP 31382 "rpc worker-3138" 0x00007fc2e3b34ad3 in ?? ()
30 LWP 31383 "rpc worker-3138" 0x00007fc2e3b34ad3 in ?? ()
31 LWP 31384 "rpc worker-3138" 0x00007fc2e3b34ad3 in ?? ()
32 LWP 31385 "rpc worker-3138" 0x00007fc2e3b34ad3 in ?? ()
33 LWP 31386 "rpc worker-3138" 0x00007fc2e3b34ad3 in ?? ()
34 LWP 31387 "rpc worker-3138" 0x00007fc2e3b34ad3 in ?? ()
35 LWP 31388 "rpc worker-3138" 0x00007fc2e3b34ad3 in ?? ()
36 LWP 31389 "rpc worker-3138" 0x00007fc2e3b34ad3 in ?? ()
37 LWP 31390 "rpc worker-3139" 0x00007fc2e3b34ad3 in ?? ()
38 LWP 31391 "rpc worker-3139" 0x00007fc2e3b34ad3 in ?? ()
39 LWP 31392 "rpc worker-3139" 0x00007fc2e3b34ad3 in ?? ()
40 LWP 31393 "rpc worker-3139" 0x00007fc2e3b34ad3 in ?? ()
41 LWP 31394 "rpc worker-3139" 0x00007fc2e3b34ad3 in ?? ()
42 LWP 31395 "rpc worker-3139" 0x00007fc2e3b34ad3 in ?? ()
43 LWP 31396 "rpc worker-3139" 0x00007fc2e3b34ad3 in ?? ()
44 LWP 31397 "rpc worker-3139" 0x00007fc2e3b34ad3 in ?? ()
45 LWP 31398 "rpc worker-3139" 0x00007fc2e3b34ad3 in ?? ()
46 LWP 31399 "rpc worker-3139" 0x00007fc2e3b34ad3 in ?? ()
47 LWP 31400 "rpc worker-3140" 0x00007fc2e3b34ad3 in ?? ()
48 LWP 31401 "rpc worker-3140" 0x00007fc2e3b34ad3 in ?? ()
49 LWP 31402 "rpc worker-3140" 0x00007fc2e3b34ad3 in ?? ()
50 LWP 31403 "rpc worker-3140" 0x00007fc2e3b34ad3 in ?? ()
51 LWP 31404 "rpc worker-3140" 0x00007fc2e3b34ad3 in ?? ()
52 LWP 31405 "rpc worker-3140" 0x00007fc2e3b34ad3 in ?? ()
53 LWP 31406 "rpc worker-3140" 0x00007fc2e3b34ad3 in ?? ()
54 LWP 31407 "rpc worker-3140" 0x00007fc2e3b34ad3 in ?? ()
55 LWP 31408 "rpc worker-3140" 0x00007fc2e3b34ad3 in ?? ()
56 LWP 31409 "rpc worker-3140" 0x00007fc2e3b34ad3 in ?? ()
57 LWP 31410 "rpc worker-3141" 0x00007fc2e3b34ad3 in ?? ()
58 LWP 31411 "rpc worker-3141" 0x00007fc2e3b34ad3 in ?? ()
59 LWP 31412 "rpc worker-3141" 0x00007fc2e3b34ad3 in ?? ()
60 LWP 31413 "rpc worker-3141" 0x00007fc2e3b34ad3 in ?? ()
61 LWP 31414 "rpc worker-3141" 0x00007fc2e3b34ad3 in ?? ()
62 LWP 31415 "rpc worker-3141" 0x00007fc2e3b34ad3 in ?? ()
63 LWP 31416 "rpc worker-3141" 0x00007fc2e3b34ad3 in ?? ()
64 LWP 31417 "rpc worker-3141" 0x00007fc2e3b34ad3 in ?? ()
65 LWP 31418 "rpc worker-3141" 0x00007fc2e3b34ad3 in ?? ()
66 LWP 31419 "rpc worker-3141" 0x00007fc2e3b34ad3 in ?? ()
67 LWP 31420 "rpc worker-3142" 0x00007fc2e3b34ad3 in ?? ()
68 LWP 31421 "rpc worker-3142" 0x00007fc2e3b34ad3 in ?? ()
69 LWP 31422 "rpc worker-3142" 0x00007fc2e3b34ad3 in ?? ()
70 LWP 31423 "rpc worker-3142" 0x00007fc2e3b34ad3 in ?? ()
71 LWP 31424 "rpc worker-3142" 0x00007fc2e3b34ad3 in ?? ()
72 LWP 31425 "rpc worker-3142" 0x00007fc2e3b34ad3 in ?? ()
73 LWP 31426 "rpc worker-3142" 0x00007fc2e3b34ad3 in ?? ()
74 LWP 31427 "rpc worker-3142" 0x00007fc2e3b34ad3 in ?? ()
75 LWP 31428 "rpc worker-3142" 0x00007fc2e3b34ad3 in ?? ()
76 LWP 31429 "rpc worker-3142" 0x00007fc2e3b34ad3 in ?? ()
77 LWP 31430 "rpc worker-3143" 0x00007fc2e3b34ad3 in ?? ()
78 LWP 31431 "rpc worker-3143" 0x00007fc2e3b34ad3 in ?? ()
79 LWP 31432 "rpc worker-3143" 0x00007fc2e3b34ad3 in ?? ()
80 LWP 31433 "rpc worker-3143" 0x00007fc2e3b34ad3 in ?? ()
81 LWP 31434 "rpc worker-3143" 0x00007fc2e3b34ad3 in ?? ()
82 LWP 31435 "rpc worker-3143" 0x00007fc2e3b34ad3 in ?? ()
83 LWP 31436 "rpc worker-3143" 0x00007fc2e3b34ad3 in ?? ()
84 LWP 31437 "rpc worker-3143" 0x00007fc2e3b34ad3 in ?? ()
85 LWP 31438 "rpc worker-3143" 0x00007fc2e3b34ad3 in ?? ()
86 LWP 31439 "rpc worker-3143" 0x00007fc2e3b34ad3 in ?? ()
87 LWP 31440 "rpc worker-3144" 0x00007fc2e3b34ad3 in ?? ()
88 LWP 31441 "rpc worker-3144" 0x00007fc2e3b34ad3 in ?? ()
89 LWP 31442 "rpc worker-3144" 0x00007fc2e3b34ad3 in ?? ()
90 LWP 31443 "rpc worker-3144" 0x00007fc2e3b34ad3 in ?? ()
91 LWP 31444 "rpc worker-3144" 0x00007fc2e3b34ad3 in ?? ()
92 LWP 31445 "rpc worker-3144" 0x00007fc2e3b34ad3 in ?? ()
93 LWP 31446 "rpc worker-3144" 0x00007fc2e3b34ad3 in ?? ()
94 LWP 31447 "rpc worker-3144" 0x00007fc2e3b34ad3 in ?? ()
95 LWP 31448 "rpc worker-3144" 0x00007fc2e3b34ad3 in ?? ()
96 LWP 31449 "rpc worker-3144" 0x00007fc2e3b34ad3 in ?? ()
97 LWP 31450 "rpc worker-3145" 0x00007fc2e3b34ad3 in ?? ()
98 LWP 31451 "rpc worker-3145" 0x00007fc2e3b34ad3 in ?? ()
99 LWP 31452 "rpc worker-3145" 0x00007fc2e3b34ad3 in ?? ()
100 LWP 31453 "rpc worker-3145" 0x00007fc2e3b34ad3 in ?? ()
101 LWP 31454 "rpc worker-3145" 0x00007fc2e3b34ad3 in ?? ()
102 LWP 31455 "rpc worker-3145" 0x00007fc2e3b34ad3 in ?? ()
103 LWP 31456 "rpc worker-3145" 0x00007fc2e3b34ad3 in ?? ()
104 LWP 31457 "rpc worker-3145" 0x00007fc2e3b34ad3 in ?? ()
105 LWP 31458 "rpc worker-3145" 0x00007fc2e3b34ad3 in ?? ()
106 LWP 31459 "rpc worker-3145" 0x00007fc2e3b34ad3 in ?? ()
107 LWP 31460 "rpc worker-3146" 0x00007fc2e3b34ad3 in ?? ()
108 LWP 31461 "rpc worker-3146" 0x00007fc2e3b34ad3 in ?? ()
109 LWP 31462 "rpc worker-3146" 0x00007fc2e3b34ad3 in ?? ()
110 LWP 31463 "rpc worker-3146" 0x00007fc2e3b34ad3 in ?? ()
111 LWP 31464 "rpc worker-3146" 0x00007fc2e3b34ad3 in ?? ()
112 LWP 31465 "rpc worker-3146" 0x00007fc2e3b34ad3 in ?? ()
113 LWP 31466 "rpc worker-3146" 0x00007fc2e3b34ad3 in ?? ()
114 LWP 31467 "rpc worker-3146" 0x00007fc2e3b34ad3 in ?? ()
115 LWP 31468 "rpc worker-3146" 0x00007fc2e3b34ad3 in ?? ()
116 LWP 31469 "rpc worker-3146" 0x00007fc2e3b34ad3 in ?? ()
117 LWP 31470 "rpc worker-3147" 0x00007fc2e3b34ad3 in ?? ()
118 LWP 31471 "diag-logger-314" 0x00007fc2e3b34fb9 in ?? ()
119 LWP 31472 "result-tracker-" 0x00007fc2e3b34fb9 in ?? ()
120 LWP 31473 "excess-log-dele" 0x00007fc2e3b34fb9 in ?? ()
121 LWP 31474 "acceptor-31474" 0x00007fc2def3c0c7 in ?? ()
122 LWP 31475 "heartbeat-31475" 0x00007fc2e3b34fb9 in ?? ()
123 LWP 31476 "maintenance_sch" 0x00007fc2e3b34fb9 in ?? ()
124 LWP 31783 "wal-append [wor" 0x00007fc2e3b34fb9 in ?? ()
125 LWP 31976 "raft [worker]-3" 0x00007fc2e3b34fb9 in ?? ()
Thread 125 (LWP 31976):
#0 0x00007fc2e3b34fb9 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 124 (LWP 31783):
#0 0x00007fc2e3b34fb9 in ?? ()
#1 0x00007b10000583f0 in ?? ()
#2 0x00000000000012ac in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007b6400060018 in ?? ()
#5 0x00007fc294dbc440 in ?? ()
#6 0x0000000000002558 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 123 (LWP 31476):
#0 0x00007fc2e3b34fb9 in ?? ()
#1 0x00007b0100000000 in ?? ()
#2 0x0000000000000104 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007b54000028f0 in ?? ()
#5 0x00007fc297fb96c0 in ?? ()
#6 0x0000000000000208 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 122 (LWP 31475):
#0 0x00007fc2e3b34fb9 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 121 (LWP 31474):
#0 0x00007fc2def3c0c7 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 120 (LWP 31473):
#0 0x00007fc2e3b34fb9 in ?? ()
#1 0x00007fc2997bc940 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007fff4626b180 in ?? ()
#5 0x00007fc2997bc7b0 in ?? ()
#6 0x0000000000000002 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 119 (LWP 31472):
#0 0x00007fc2e3b34fb9 in ?? ()
#1 0x0000000085352fb8 in ?? ()
#2 0x0000000000000041 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007b3400001008 in ?? ()
#5 0x00007fc299fbd800 in ?? ()
#6 0x0000000000000082 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 118 (LWP 31471):
#0 0x00007fc2e3b34fb9 in ?? ()
#1 0x00007fc2dce88008 in ?? ()
#2 0x0000000000000041 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007b4000000c90 in ?? ()
#5 0x00007fc29a7be750 in ?? ()
#6 0x0000000000000082 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 117 (LWP 31470):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 116 (LWP 31469):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 115 (LWP 31468):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 114 (LWP 31467):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 113 (LWP 31466):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 112 (LWP 31465):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 111 (LWP 31464):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 110 (LWP 31463):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 109 (LWP 31462):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 108 (LWP 31461):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 107 (LWP 31460):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 106 (LWP 31459):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 105 (LWP 31458):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 104 (LWP 31457):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 103 (LWP 31456):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 102 (LWP 31455):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 101 (LWP 31454):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 100 (LWP 31453):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 99 (LWP 31452):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 98 (LWP 31451):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 97 (LWP 31450):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 96 (LWP 31449):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 95 (LWP 31448):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 94 (LWP 31447):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 93 (LWP 31446):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 92 (LWP 31445):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 91 (LWP 31444):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 90 (LWP 31443):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 89 (LWP 31442):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 88 (LWP 31441):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 87 (LWP 31440):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 86 (LWP 31439):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 85 (LWP 31438):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 84 (LWP 31437):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 83 (LWP 31436):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 82 (LWP 31435):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 81 (LWP 31434):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 80 (LWP 31433):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 79 (LWP 31432):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 78 (LWP 31431):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 77 (LWP 31430):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000008 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x00007b24001147c8 in ?? ()
#4 0x00007fc2af9ba710 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fc2af9ba730 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 76 (LWP 31429):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 75 (LWP 31428):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 74 (LWP 31427):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 73 (LWP 31426):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 72 (LWP 31425):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 71 (LWP 31424):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 70 (LWP 31423):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 69 (LWP 31422):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 68 (LWP 31421):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 67 (LWP 31420):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 66 (LWP 31419):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 65 (LWP 31418):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 64 (LWP 31417):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 63 (LWP 31416):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 62 (LWP 31415):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 61 (LWP 31414):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 60 (LWP 31413):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 59 (LWP 31412):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 58 (LWP 31411):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 57 (LWP 31410):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x00007b24000b902c in ?? ()
#4 0x00007fc2b9dbc710 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fc2b9dbc730 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x007f0400000026c8 in ?? ()
#9 0x00007fc2e3b34770 in ?? ()
#10 0x00007fc2b9dbc730 in ?? ()
#11 0x0002008300000dfe in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 56 (LWP 31409):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 55 (LWP 31408):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 54 (LWP 31407):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 53 (LWP 31406):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 52 (LWP 31405):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 51 (LWP 31404):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 50 (LWP 31403):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 49 (LWP 31402):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 48 (LWP 31401):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 47 (LWP 31400):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 46 (LWP 31399):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 45 (LWP 31398):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 44 (LWP 31397):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 43 (LWP 31396):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 42 (LWP 31395):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 41 (LWP 31394):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 40 (LWP 31393):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 39 (LWP 31392):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 38 (LWP 31391):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 37 (LWP 31390):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x000000000000023c in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x00007b240005ffe8 in ?? ()
#4 0x00007fc2c41be710 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fc2c41be730 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 36 (LWP 31389):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x00007b240005d7fc in ?? ()
#4 0x00007fc2c4bb6710 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fc2c4bb6730 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x000000000045e389 in __sanitizer::internal_alloc_placeholder ()
#9 0x00007fc2e3b34770 in ?? ()
#10 0x00007fc2c4bb6730 in ?? ()
#11 0x00007fc2dbf85c68 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 35 (LWP 31388):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x00007b2400058ffc in ?? ()
#4 0x00007fc2c53b7710 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fc2c53b7730 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x000000000045e389 in __sanitizer::internal_alloc_placeholder ()
#9 0x00007fc2e3b34770 in ?? ()
#10 0x00007fc2c53b7730 in ?? ()
#11 0x00007fc2dbf7dc68 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 34 (LWP 31387):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x00007b24000547fc in ?? ()
#4 0x00007fc2c5bb8710 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fc2c5bb8730 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x000000000045e389 in __sanitizer::internal_alloc_placeholder ()
#9 0x00007fc2e3b34770 in ?? ()
#10 0x00007fc2c5bb8730 in ?? ()
#11 0x00007fc2dbf75c68 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 33 (LWP 31386):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x00007b240004fffc in ?? ()
#4 0x00007fc2c63b9710 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fc2c63b9730 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x000000000045e389 in __sanitizer::internal_alloc_placeholder ()
#9 0x00007fc2e3b34770 in ?? ()
#10 0x00007fc2c63b9730 in ?? ()
#11 0x00007fc2dc4dcc68 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 32 (LWP 31385):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x00007b240004900c in ?? ()
#4 0x00007fc2c6bba710 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fc2c6bba730 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x000000000045e389 in __sanitizer::internal_alloc_placeholder ()
#9 0x00007fc2e3b34770 in ?? ()
#10 0x00007fc2c6bba730 in ?? ()
#11 0x00007fc2dc4d4c68 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 31 (LWP 31384):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000268 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x00007b2400044808 in ?? ()
#4 0x00007fc2c73bb710 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fc2c73bb730 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 30 (LWP 31383):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x00000000000004e1 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x00007b240004000c in ?? ()
#4 0x00007fc2c7bbc710 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fc2c7bbc730 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x000000000045e389 in __sanitizer::internal_alloc_placeholder ()
#9 0x00007fc2e3b34770 in ?? ()
#10 0x00007fc2c7bbc730 in ?? ()
#11 0x00007fc2bf9c36a0 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 29 (LWP 31382):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 28 (LWP 31381):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 27 (LWP 31380):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 26 (LWP 31379):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 25 (LWP 31378):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 24 (LWP 31377):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 23 (LWP 31376):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 22 (LWP 31375):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 21 (LWP 31374):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 20 (LWP 31373):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 19 (LWP 31372):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 18 (LWP 31371):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 17 (LWP 31370):
#0 0x00007fc2e3b34fb9 in ?? ()
#1 0x0000000017a335f0 in ?? ()
#2 0x0000000000000006 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007b4800003a00 in ?? ()
#5 0x00007fc2ce78e700 in ?? ()
#6 0x000000000000000c in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 16 (LWP 31369):
#0 0x00007fc2e3b34fb9 in ?? ()
#1 0x00007fc2cef8f9a8 in ?? ()
#2 0x000000000000000d in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007b44000372d8 in ?? ()
#5 0x00007fc2cef8f840 in ?? ()
#6 0x000000000000001a in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 15 (LWP 31368):
#0 0x00007fc2e3b34fb9 in ?? ()
#1 0x0000000000000018 in ?? ()
#2 0x0000000000000006 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007b5800000118 in ?? ()
#5 0x00007fc2cf790410 in ?? ()
#6 0x000000000000000c in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 14 (LWP 31367):
#0 0x00007fc2e3b34ad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 13 (LWP 31366):
#0 0x00007fc2def3aa47 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 12 (LWP 31365):
#0 0x00007fc2def3aa47 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 11 (LWP 31364):
#0 0x00007fc2def3aa47 in ?? ()
#1 0x00007b280003a028 in ?? ()
#2 0x0040e000000aa05c in ?? ()
#3 0x00007fc2d1794500 in ?? ()
#4 0x00007fc2d1795b80 in ?? ()
#5 0x00007fc2d1794500 in ?? ()
#6 0x0000000000000010 in ?? ()
#7 0x00007b5800001200 in ?? ()
#8 0x0000000000488555 in __sanitizer::internal_alloc_placeholder ()
#9 0x00007fc2dc6bc000 in ?? ()
#10 0x0000000000488459 in __sanitizer::internal_alloc_placeholder ()
#11 0x00007fc2d1795b80 in ?? ()
#12 0x00007fc2e1993069 in ?? ()
#13 0x00007b4c00000000 in ?? ()
#14 0x00007fc2e7113120 in ?? ()
#15 0x00007b4c000026d0 in ?? ()
#16 0x00007b4c000026d8 in ?? ()
#17 0x00007fc2d17947a0 in ?? ()
#18 0x00007b4400036b40 in ?? ()
#19 0x00007fc2d1794cd0 in ?? ()
#20 0x0000000000000000 in ?? ()
Thread 10 (LWP 31363):
#0 0x00007fc2def3aa47 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 9 (LWP 31360):
#0 0x00007fc2def2dcb9 in ?? ()
#1 0x00007fc2d7dbcc10 in ?? ()
#2 0x00007b0400009510 in ?? ()
#3 0x00007fc2d7dbdb80 in ?? ()
#4 0x00007fc2d7dbcc10 in ?? ()
#5 0x00007b0400009510 in ?? ()
#6 0x0000000000488763 in __sanitizer::internal_alloc_placeholder ()
#7 0x00007fc2dc8a8000 in ?? ()
#8 0x0100000000000001 in ?? ()
#9 0x00007fc2d7dbdb80 in ?? ()
#10 0x00007fc2e89108b8 in ?? ()
#11 0x0000000000000000 in ?? ()
Thread 8 (LWP 31359):
#0 0x00007fc2e3b34fb9 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 7 (LWP 31358):
#0 0x00007fc2e3b389e2 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 6 (LWP 31351):
#0 0x00007fc2e3b34fb9 in ?? ()
#1 0x00007fc2d8dbea40 in ?? ()
#2 0x000000000000014b in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007b4400035b98 in ?? ()
#5 0x00007fc2d8dbe5d0 in ?? ()
#6 0x0000000000000296 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 5 (LWP 31350):
#0 0x00007fc2e3b34fb9 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 4 (LWP 31349):
#0 0x00007fc2e3b34fb9 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 3 (LWP 31348):
#0 0x00007fc2e3b34fb9 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 2 (LWP 31347):
#0 0x00007fc2deefd7a0 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 1 (LWP 31346):
#0 0x00007fc2e3b38d50 in ?? ()
#1 0x0000600001000078 in ?? ()
#2 0x00000000004679eb in __sanitizer::internal_alloc_placeholder ()
#3 0x00007fc2de15bcc0 in ?? ()
#4 0x00007fc2de15bcc0 in ?? ()
#5 0x00007fff4626af90 in ?? ()
#6 0x000000000048adb4 in __sanitizer::internal_alloc_placeholder ()
#7 0x0000600001000078 in ?? ()
#8 0x0000e00000a94876 in ?? ()
#9 0x00007fc2de15bcc0 in ?? ()
#10 0x00007fc2e2061ebb in ?? ()
#11 0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20250624 14:19:27.390828 30585 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 1 with UUID c58be4fc0d7748f29f44114198693091 and pid 31479
************************ BEGIN STACKS **************************
[New LWP 31480]
[New LWP 31481]
[New LWP 31482]
[New LWP 31483]
[New LWP 31484]
[New LWP 31491]
[New LWP 31492]
[New LWP 31493]
[New LWP 31496]
[New LWP 31497]
[New LWP 31498]
[New LWP 31499]
[New LWP 31500]
[New LWP 31501]
[New LWP 31502]
[New LWP 31503]
[New LWP 31504]
[New LWP 31505]
[New LWP 31506]
[New LWP 31507]
[New LWP 31508]
[New LWP 31509]
[New LWP 31510]
[New LWP 31511]
[New LWP 31512]
[New LWP 31513]
[New LWP 31514]
[New LWP 31515]
[New LWP 31516]
[New LWP 31517]
[New LWP 31518]
[New LWP 31519]
[New LWP 31520]
[New LWP 31521]
[New LWP 31522]
[New LWP 31523]
[New LWP 31524]
[New LWP 31525]
[New LWP 31526]
[New LWP 31527]
[New LWP 31528]
[New LWP 31529]
[New LWP 31530]
[New LWP 31531]
[New LWP 31532]
[New LWP 31533]
[New LWP 31534]
[New LWP 31535]
[New LWP 31536]
[New LWP 31537]
[New LWP 31538]
[New LWP 31539]
[New LWP 31540]
[New LWP 31541]
[New LWP 31542]
[New LWP 31543]
[New LWP 31544]
[New LWP 31545]
[New LWP 31546]
[New LWP 31547]
[New LWP 31548]
[New LWP 31549]
[New LWP 31550]
[New LWP 31551]
[New LWP 31552]
[New LWP 31553]
[New LWP 31554]
[New LWP 31555]
[New LWP 31556]
[New LWP 31557]
[New LWP 31558]
[New LWP 31559]
[New LWP 31560]
[New LWP 31561]
[New LWP 31562]
[New LWP 31563]
[New LWP 31564]
[New LWP 31565]
[New LWP 31566]
[New LWP 31567]
[New LWP 31568]
[New LWP 31569]
[New LWP 31570]
[New LWP 31571]
[New LWP 31572]
[New LWP 31573]
[New LWP 31574]
[New LWP 31575]
[New LWP 31576]
[New LWP 31577]
[New LWP 31578]
[New LWP 31579]
[New LWP 31580]
[New LWP 31581]
[New LWP 31582]
[New LWP 31583]
[New LWP 31584]
[New LWP 31585]
[New LWP 31586]
[New LWP 31587]
[New LWP 31588]
[New LWP 31589]
[New LWP 31590]
[New LWP 31591]
[New LWP 31592]
[New LWP 31593]
[New LWP 31594]
[New LWP 31595]
[New LWP 31596]
[New LWP 31597]
[New LWP 31598]
[New LWP 31599]
[New LWP 31600]
[New LWP 31601]
[New LWP 31602]
[New LWP 31603]
[New LWP 31604]
[New LWP 31605]
[New LWP 31606]
[New LWP 31607]
[New LWP 31608]
[New LWP 31609]
Cannot access memory at address 0x18
Cannot access memory at address 0x10
Cannot access memory at address 0x18
Cannot access memory at address 0x18
Cannot access memory at address 0x10
0x00007f9e48ef2d50 in ?? ()
Id Target Id Frame
* 1 LWP 31479 "kudu" 0x00007f9e48ef2d50 in ?? ()
2 LWP 31480 "kudu" 0x00007f9e442b77a0 in ?? ()
3 LWP 31481 "kudu" 0x00007f9e48eeefb9 in ?? ()
4 LWP 31482 "kudu" 0x00007f9e48eeefb9 in ?? ()
5 LWP 31483 "kudu" 0x00007f9e48eeefb9 in ?? ()
6 LWP 31484 "kernel-watcher-" 0x00007f9e48eeefb9 in ?? ()
7 LWP 31491 "ntp client-3149" 0x00007f9e48ef29e2 in ?? ()
8 LWP 31492 "file cache-evic" 0x00007f9e48eeefb9 in ?? ()
9 LWP 31493 "sq_acceptor" 0x00007f9e442e7cb9 in ?? ()
10 LWP 31496 "rpc reactor-314" 0x00007f9e442f4a47 in ?? ()
11 LWP 31497 "rpc reactor-314" 0x00007f9e442f4a47 in ?? ()
12 LWP 31498 "rpc reactor-314" 0x00007f9e442f4a47 in ?? ()
13 LWP 31499 "rpc reactor-314" 0x00007f9e442f4a47 in ?? ()
14 LWP 31500 "MaintenanceMgr " 0x00007f9e48eeead3 in ?? ()
15 LWP 31501 "txn-status-mana" 0x00007f9e48eeefb9 in ?? ()
16 LWP 31502 "collect_and_rem" 0x00007f9e48eeefb9 in ?? ()
17 LWP 31503 "tc-session-exp-" 0x00007f9e48eeefb9 in ?? ()
18 LWP 31504 "rpc worker-3150" 0x00007f9e48eeead3 in ?? ()
19 LWP 31505 "rpc worker-3150" 0x00007f9e48eeead3 in ?? ()
20 LWP 31506 "rpc worker-3150" 0x00007f9e48eeead3 in ?? ()
21 LWP 31507 "rpc worker-3150" 0x00007f9e48eeead3 in ?? ()
22 LWP 31508 "rpc worker-3150" 0x00007f9e48eeead3 in ?? ()
23 LWP 31509 "rpc worker-3150" 0x00007f9e48eeead3 in ?? ()
24 LWP 31510 "rpc worker-3151" 0x00007f9e48eeead3 in ?? ()
25 LWP 31511 "rpc worker-3151" 0x00007f9e48eeead3 in ?? ()
26 LWP 31512 "rpc worker-3151" 0x00007f9e48eeead3 in ?? ()
27 LWP 31513 "rpc worker-3151" 0x00007f9e48eeead3 in ?? ()
28 LWP 31514 "rpc worker-3151" 0x00007f9e48eeead3 in ?? ()
29 LWP 31515 "rpc worker-3151" 0x00007f9e48eeead3 in ?? ()
30 LWP 31516 "rpc worker-3151" 0x00007f9e48eeead3 in ?? ()
31 LWP 31517 "rpc worker-3151" 0x00007f9e48eeead3 in ?? ()
32 LWP 31518 "rpc worker-3151" 0x00007f9e48eeead3 in ?? ()
33 LWP 31519 "rpc worker-3151" 0x00007f9e48eeead3 in ?? ()
34 LWP 31520 "rpc worker-3152" 0x00007f9e48eeead3 in ?? ()
35 LWP 31521 "rpc worker-3152" 0x00007f9e48eeead3 in ?? ()
36 LWP 31522 "rpc worker-3152" 0x00007f9e48eeead3 in ?? ()
37 LWP 31523 "rpc worker-3152" 0x00007f9e48eeead3 in ?? ()
38 LWP 31524 "rpc worker-3152" 0x00007f9e48eeead3 in ?? ()
39 LWP 31525 "rpc worker-3152" 0x00007f9e48eeead3 in ?? ()
40 LWP 31526 "rpc worker-3152" 0x00007f9e48eeead3 in ?? ()
41 LWP 31527 "rpc worker-3152" 0x00007f9e48eeead3 in ?? ()
42 LWP 31528 "rpc worker-3152" 0x00007f9e48eeead3 in ?? ()
43 LWP 31529 "rpc worker-3152" 0x00007f9e48eeead3 in ?? ()
44 LWP 31530 "rpc worker-3153" 0x00007f9e48eeead3 in ?? ()
45 LWP 31531 "rpc worker-3153" 0x00007f9e48eeead3 in ?? ()
46 LWP 31532 "rpc worker-3153" 0x00007f9e48eeead3 in ?? ()
47 LWP 31533 "rpc worker-3153" 0x00007f9e48eeead3 in ?? ()
48 LWP 31534 "rpc worker-3153" 0x00007f9e48eeead3 in ?? ()
49 LWP 31535 "rpc worker-3153" 0x00007f9e48eeead3 in ?? ()
50 LWP 31536 "rpc worker-3153" 0x00007f9e48eeead3 in ?? ()
51 LWP 31537 "rpc worker-3153" 0x00007f9e48eeead3 in ?? ()
52 LWP 31538 "rpc worker-3153" 0x00007f9e48eeead3 in ?? ()
53 LWP 31539 "rpc worker-3153" 0x00007f9e48eeead3 in ?? ()
54 LWP 31540 "rpc worker-3154" 0x00007f9e48eeead3 in ?? ()
55 LWP 31541 "rpc worker-3154" 0x00007f9e48eeead3 in ?? ()
56 LWP 31542 "rpc worker-3154" 0x00007f9e48eeead3 in ?? ()
57 LWP 31543 "rpc worker-3154" 0x00007f9e48eeead3 in ?? ()
58 LWP 31544 "rpc worker-3154" 0x00007f9e48eeead3 in ?? ()
59 LWP 31545 "rpc worker-3154" 0x00007f9e48eeead3 in ?? ()
60 LWP 31546 "rpc worker-3154" 0x00007f9e48eeead3 in ?? ()
61 LWP 31547 "rpc worker-3154" 0x00007f9e48eeead3 in ?? ()
62 LWP 31548 "rpc worker-3154" 0x00007f9e48eeead3 in ?? ()
63 LWP 31549 "rpc worker-3154" 0x00007f9e48eeead3 in ?? ()
64 LWP 31550 "rpc worker-3155" 0x00007f9e48eeead3 in ?? ()
65 LWP 31551 "rpc worker-3155" 0x00007f9e48eeead3 in ?? ()
66 LWP 31552 "rpc worker-3155" 0x00007f9e48eeead3 in ?? ()
67 LWP 31553 "rpc worker-3155" 0x00007f9e48eeead3 in ?? ()
68 LWP 31554 "rpc worker-3155" 0x00007f9e48eeead3 in ?? ()
69 LWP 31555 "rpc worker-3155" 0x00007f9e48eeead3 in ?? ()
70 LWP 31556 "rpc worker-3155" 0x00007f9e48eeead3 in ?? ()
71 LWP 31557 "rpc worker-3155" 0x00007f9e48eeead3 in ?? ()
72 LWP 31558 "rpc worker-3155" 0x00007f9e48eeead3 in ?? ()
73 LWP 31559 "rpc worker-3155" 0x00007f9e48eeead3 in ?? ()
74 LWP 31560 "rpc worker-3156" 0x00007f9e48eeead3 in ?? ()
75 LWP 31561 "rpc worker-3156" 0x00007f9e48eeead3 in ?? ()
76 LWP 31562 "rpc worker-3156" 0x00007f9e48eeead3 in ?? ()
77 LWP 31563 "rpc worker-3156" 0x00007f9e48eeead3 in ?? ()
78 LWP 31564 "rpc worker-3156" 0x00007f9e48eeead3 in ?? ()
79 LWP 31565 "rpc worker-3156" 0x00007f9e48eeead3 in ?? ()
80 LWP 31566 "rpc worker-3156" 0x00007f9e48eeead3 in ?? ()
81 LWP 31567 "rpc worker-3156" 0x00007f9e48eeead3 in ?? ()
82 LWP 31568 "rpc worker-3156" 0x00007f9e48eeead3 in ?? ()
83 LWP 31569 "rpc worker-3156" 0x00007f9e48eeead3 in ?? ()
84 LWP 31570 "rpc worker-3157" 0x00007f9e48eeead3 in ?? ()
85 LWP 31571 "rpc worker-3157" 0x00007f9e48eeead3 in ?? ()
86 LWP 31572 "rpc worker-3157" 0x00007f9e48eeead3 in ?? ()
87 LWP 31573 "rpc worker-3157" 0x00007f9e48eeead3 in ?? ()
88 LWP 31574 "rpc worker-3157" 0x00007f9e48eeead3 in ?? ()
89 LWP 31575 "rpc worker-3157" 0x00007f9e48eeead3 in ?? ()
90 LWP 31576 "rpc worker-3157" 0x00007f9e48eeead3 in ?? ()
91 LWP 31577 "rpc worker-3157" 0x00007f9e48eeead3 in ?? ()
92 LWP 31578 "rpc worker-3157" 0x00007f9e48eeead3 in ?? ()
93 LWP 31579 "rpc worker-3157" 0x00007f9e48eeead3 in ?? ()
94 LWP 31580 "rpc worker-3158" 0x00007f9e48eeead3 in ?? ()
95 LWP 31581 "rpc worker-3158" 0x00007f9e48eeead3 in ?? ()
96 LWP 31582 "rpc worker-3158" 0x00007f9e48eeead3 in ?? ()
97 LWP 31583 "rpc worker-3158" 0x00007f9e48eeead3 in ?? ()
98 LWP 31584 "rpc worker-3158" 0x00007f9e48eeead3 in ?? ()
99 LWP 31585 "rpc worker-3158" 0x00007f9e48eeead3 in ?? ()
100 LWP 31586 "rpc worker-3158" 0x00007f9e48eeead3 in ?? ()
101 LWP 31587 "rpc worker-3158" 0x00007f9e48eeead3 in ?? ()
102 LWP 31588 "rpc worker-3158" 0x00007f9e48eeead3 in ?? ()
103 LWP 31589 "rpc worker-3158" 0x00007f9e48eeead3 in ?? ()
104 LWP 31590 "rpc worker-3159" 0x00007f9e48eeead3 in ?? ()
105 LWP 31591 "rpc worker-3159" 0x00007f9e48eeead3 in ?? ()
106 LWP 31592 "rpc worker-3159" 0x00007f9e48eeead3 in ?? ()
107 LWP 31593 "rpc worker-3159" 0x00007f9e48eeead3 in ?? ()
108 LWP 31594 "rpc worker-3159" 0x00007f9e48eeead3 in ?? ()
109 LWP 31595 "rpc worker-3159" 0x00007f9e48eeead3 in ?? ()
110 LWP 31596 "rpc worker-3159" 0x00007f9e48eeead3 in ?? ()
111 LWP 31597 "rpc worker-3159" 0x00007f9e48eeead3 in ?? ()
112 LWP 31598 "rpc worker-3159" 0x00007f9e48eeead3 in ?? ()
113 LWP 31599 "rpc worker-3159" 0x00007f9e48eeead3 in ?? ()
114 LWP 31600 "rpc worker-3160" 0x00007f9e48eeead3 in ?? ()
115 LWP 31601 "rpc worker-3160" 0x00007f9e48eeead3 in ?? ()
116 LWP 31602 "rpc worker-3160" 0x00007f9e48eeead3 in ?? ()
117 LWP 31603 "rpc worker-3160" 0x00007f9e48eeead3 in ?? ()
118 LWP 31604 "diag-logger-316" 0x00007f9e48eeefb9 in ?? ()
119 LWP 31605 "result-tracker-" 0x00007f9e48eeefb9 in ?? ()
120 LWP 31606 "excess-log-dele" 0x00007f9e48eeefb9 in ?? ()
121 LWP 31607 "acceptor-31607" 0x00007f9e442f60c7 in ?? ()
122 LWP 31608 "heartbeat-31608" 0x00007f9e48eeefb9 in ?? ()
123 LWP 31609 "maintenance_sch" 0x00007f9e48eeefb9 in ?? ()
Thread 123 (LWP 31609):
#0 0x00007f9e48eeefb9 in ?? ()
#1 0x00007b0100000000 in ?? ()
#2 0x0000000000000100 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007b54000028f0 in ?? ()
#5 0x00007f9dfd2b96c0 in ?? ()
#6 0x0000000000000200 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 122 (LWP 31608):
#0 0x00007f9e48eeefb9 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 121 (LWP 31607):
#0 0x00007f9e442f60c7 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 120 (LWP 31606):
#0 0x00007f9e48eeefb9 in ?? ()
#1 0x00007f9dfeabc940 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007ffdf443da40 in ?? ()
#5 0x00007f9dfeabc7b0 in ?? ()
#6 0x0000000000000002 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 119 (LWP 31605):
#0 0x00007f9e48eeefb9 in ?? ()
#1 0x0000000085352fb8 in ?? ()
#2 0x0000000000000040 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007b3400001008 in ?? ()
#5 0x00007f9dff2bd800 in ?? ()
#6 0x0000000000000080 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 118 (LWP 31604):
#0 0x00007f9e48eeefb9 in ?? ()
#1 0x00007f9e42336008 in ?? ()
#2 0x0000000000000040 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007b4000000590 in ?? ()
#5 0x00007f9dffabe750 in ?? ()
#6 0x0000000000000080 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 117 (LWP 31603):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 116 (LWP 31602):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 115 (LWP 31601):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 114 (LWP 31600):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 113 (LWP 31599):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 112 (LWP 31598):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 111 (LWP 31597):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 110 (LWP 31596):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 109 (LWP 31595):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 108 (LWP 31594):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 107 (LWP 31593):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 106 (LWP 31592):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 105 (LWP 31591):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 104 (LWP 31590):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 103 (LWP 31589):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 102 (LWP 31588):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 101 (LWP 31587):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 100 (LWP 31586):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 99 (LWP 31585):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 98 (LWP 31584):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 97 (LWP 31583):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 96 (LWP 31582):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 95 (LWP 31581):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 94 (LWP 31580):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 93 (LWP 31579):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 92 (LWP 31578):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 91 (LWP 31577):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 90 (LWP 31576):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 89 (LWP 31575):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 88 (LWP 31574):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 87 (LWP 31573):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 86 (LWP 31572):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 85 (LWP 31571):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 84 (LWP 31570):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 83 (LWP 31569):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 82 (LWP 31568):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 81 (LWP 31567):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 80 (LWP 31566):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 79 (LWP 31565):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 78 (LWP 31564):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 77 (LWP 31563):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x00000000000008f7 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x00007b2400118fcc in ?? ()
#4 0x00007f9e14cba710 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f9e14cba730 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x0000000000000000 in ?? ()
Thread 76 (LWP 31562):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000996 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x00007b24001147c8 in ?? ()
#4 0x00007f9e154bb710 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f9e154bb730 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 75 (LWP 31561):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000035 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x00007b240010ffcc in ?? ()
#4 0x00007f9e15cbc710 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f9e15cbc730 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x000000000045e389 in __sanitizer::internal_alloc_placeholder ()
#9 0x00007f9e48eee770 in ?? ()
#10 0x00007f9e15cbc730 in ?? ()
#11 0x00007f9df846e080 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 74 (LWP 31560):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 73 (LWP 31559):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 72 (LWP 31558):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 71 (LWP 31557):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 70 (LWP 31556):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 69 (LWP 31555):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 68 (LWP 31554):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 67 (LWP 31553):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 66 (LWP 31552):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 65 (LWP 31551):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 64 (LWP 31550):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 63 (LWP 31549):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 62 (LWP 31548):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 61 (LWP 31547):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 60 (LWP 31546):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 59 (LWP 31545):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 58 (LWP 31544):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 57 (LWP 31543):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x00007b24000c001c in ?? ()
#4 0x00007f9e1f0bc710 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f9e1f0bc730 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x007f0400000026c8 in ?? ()
#9 0x00007f9e48eee770 in ?? ()
#10 0x00007f9e1f0bc730 in ?? ()
#11 0x0002008300000dfe in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 56 (LWP 31542):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 55 (LWP 31541):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 54 (LWP 31540):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 53 (LWP 31539):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 52 (LWP 31538):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 51 (LWP 31537):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 50 (LWP 31536):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 49 (LWP 31535):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 48 (LWP 31534):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 47 (LWP 31533):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 46 (LWP 31532):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 45 (LWP 31531):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 44 (LWP 31530):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 43 (LWP 31529):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 42 (LWP 31528):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 41 (LWP 31527):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 40 (LWP 31526):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 39 (LWP 31525):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 38 (LWP 31524):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 37 (LWP 31523):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000002 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x00007b24000647e8 in ?? ()
#4 0x00007f9e294be710 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f9e294be730 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 36 (LWP 31522):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x00007b240005ffec in ?? ()
#4 0x00007f9e29eb6710 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f9e29eb6730 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x000000000045e389 in __sanitizer::internal_alloc_placeholder ()
#9 0x00007f9e48eee770 in ?? ()
#10 0x00007f9e29eb6730 in ?? ()
#11 0x00007f9e41441c58 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 35 (LWP 31521):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000002 in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x00007b240005d7f8 in ?? ()
#4 0x00007f9e2a6b7710 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f9e2a6b7730 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 34 (LWP 31520):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x00007b2400058ffc in ?? ()
#4 0x00007f9e2aeb8710 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f9e2aeb8730 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x000000000045e389 in __sanitizer::internal_alloc_placeholder ()
#9 0x00007f9e48eee770 in ?? ()
#10 0x00007f9e2aeb8730 in ?? ()
#11 0x00007f9e41431c58 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 33 (LWP 31519):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x00007b24000547fc in ?? ()
#4 0x00007f9e2b6b9710 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f9e2b6b9730 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x000000000045e389 in __sanitizer::internal_alloc_placeholder ()
#9 0x00007f9e48eee770 in ?? ()
#10 0x00007f9e2b6b9730 in ?? ()
#11 0x00007f9e41996c58 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 32 (LWP 31518):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x00007b240004fffc in ?? ()
#4 0x00007f9e2beba710 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f9e2beba730 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x000000000045e389 in __sanitizer::internal_alloc_placeholder ()
#9 0x00007f9e48eee770 in ?? ()
#10 0x00007f9e2beba730 in ?? ()
#11 0x00007f9e4198ec58 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 31 (LWP 31517):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x00007b240004900c in ?? ()
#4 0x00007f9e2c6bb710 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007f9e2c6bb730 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x000000000045e389 in __sanitizer::internal_alloc_placeholder ()
#9 0x00007f9e48eee770 in ?? ()
#10 0x00007f9e2c6bb730 in ?? ()
#11 0x00007f9e41986c58 in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 30 (LWP 31516):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 29 (LWP 31515):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 28 (LWP 31514):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 27 (LWP 31513):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 26 (LWP 31512):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 25 (LWP 31511):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 24 (LWP 31510):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 23 (LWP 31509):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 22 (LWP 31508):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 21 (LWP 31507):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 20 (LWP 31506):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 19 (LWP 31505):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 18 (LWP 31504):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 17 (LWP 31503):
#0 0x00007f9e48eeefb9 in ?? ()
#1 0x0000000017a335f0 in ?? ()
#2 0x0000000000000006 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007b4800003a00 in ?? ()
#5 0x00007f9e33a8e700 in ?? ()
#6 0x000000000000000c in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 16 (LWP 31502):
#0 0x00007f9e48eeefb9 in ?? ()
#1 0x00007f9e3428f9a8 in ?? ()
#2 0x000000000000000c in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007b44000372d8 in ?? ()
#5 0x00007f9e3428f840 in ?? ()
#6 0x0000000000000018 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 15 (LWP 31501):
#0 0x00007f9e48eeefb9 in ?? ()
#1 0x0000000000000018 in ?? ()
#2 0x0000000000000006 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007b5800000118 in ?? ()
#5 0x00007f9e34a90410 in ?? ()
#6 0x000000000000000c in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 14 (LWP 31500):
#0 0x00007f9e48eeead3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 13 (LWP 31499):
#0 0x00007f9e442f4a47 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 12 (LWP 31498):
#0 0x00007f9e442f4a47 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 11 (LWP 31497):
#0 0x00007f9e442f4a47 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 10 (LWP 31496):
#0 0x00007f9e442f4a47 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 9 (LWP 31493):
#0 0x00007f9e442e7cb9 in ?? ()
#1 0x00007f9e3d0bcc10 in ?? ()
#2 0x00007b0400009010 in ?? ()
#3 0x00007f9e3d0bdb80 in ?? ()
#4 0x00007f9e3d0bcc10 in ?? ()
#5 0x00007b0400009010 in ?? ()
#6 0x0000000000488763 in __sanitizer::internal_alloc_placeholder ()
#7 0x00007f9e41d9a000 in ?? ()
#8 0x0100000000000001 in ?? ()
#9 0x00007f9e3d0bdb80 in ?? ()
#10 0x00007f9e4dcca8b8 in ?? ()
#11 0x0000000000000000 in ?? ()
Thread 8 (LWP 31492):
#0 0x00007f9e48eeefb9 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 7 (LWP 31491):
#0 0x00007f9e48ef29e2 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 6 (LWP 31484):
#0 0x00007f9e48eeefb9 in ?? ()
#1 0x00007f9e3e0bea40 in ?? ()
#2 0x0000000000000142 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007b4400035b98 in ?? ()
#5 0x00007f9e3e0be5d0 in ?? ()
#6 0x0000000000000284 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 5 (LWP 31483):
#0 0x00007f9e48eeefb9 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 4 (LWP 31482):
#0 0x00007f9e48eeefb9 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 3 (LWP 31481):
#0 0x00007f9e48eeefb9 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 2 (LWP 31480):
#0 0x00007f9e442b77a0 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 1 (LWP 31479):
#0 0x00007f9e48ef2d50 in ?? ()
#1 0x0000600001000078 in ?? ()
#2 0x00000000004679eb in __sanitizer::internal_alloc_placeholder ()
#3 0x00007f9e43515cc0 in ?? ()
#4 0x00007f9e43515cc0 in ?? ()
#5 0x00007ffdf443d850 in ?? ()
#6 0x000000000048adb4 in __sanitizer::internal_alloc_placeholder ()
#7 0x0000600001000078 in ?? ()
#8 0x0000e00000a9e77a in ?? ()
#9 0x00007f9e43515cc0 in ?? ()
#10 0x00007f9e4741bebb in ?? ()
#11 0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20250624 14:19:28.365217 30585 external_mini_cluster-itest-base.cc:86] Attempting to dump stacks of TS 2 with UUID 856f07fb0df0493397fac237d3228de8 and pid 31612
************************ BEGIN STACKS **************************
[New LWP 31613]
[New LWP 31614]
[New LWP 31615]
[New LWP 31616]
[New LWP 31617]
[New LWP 31624]
[New LWP 31625]
[New LWP 31626]
[New LWP 31629]
[New LWP 31630]
[New LWP 31631]
[New LWP 31632]
[New LWP 31633]
[New LWP 31634]
[New LWP 31635]
[New LWP 31636]
[New LWP 31637]
[New LWP 31638]
[New LWP 31639]
[New LWP 31640]
[New LWP 31641]
[New LWP 31642]
[New LWP 31643]
[New LWP 31644]
[New LWP 31645]
[New LWP 31646]
[New LWP 31647]
[New LWP 31648]
[New LWP 31649]
[New LWP 31650]
[New LWP 31651]
[New LWP 31652]
[New LWP 31653]
[New LWP 31654]
[New LWP 31655]
[New LWP 31656]
[New LWP 31657]
[New LWP 31658]
[New LWP 31659]
[New LWP 31660]
[New LWP 31661]
[New LWP 31662]
[New LWP 31663]
[New LWP 31664]
[New LWP 31665]
[New LWP 31666]
[New LWP 31667]
[New LWP 31668]
[New LWP 31669]
[New LWP 31670]
[New LWP 31671]
[New LWP 31672]
[New LWP 31673]
[New LWP 31674]
[New LWP 31675]
[New LWP 31676]
[New LWP 31677]
[New LWP 31678]
[New LWP 31679]
[New LWP 31680]
[New LWP 31681]
[New LWP 31682]
[New LWP 31683]
[New LWP 31684]
[New LWP 31685]
[New LWP 31686]
[New LWP 31687]
[New LWP 31688]
[New LWP 31689]
[New LWP 31690]
[New LWP 31691]
[New LWP 31692]
[New LWP 31693]
[New LWP 31694]
[New LWP 31695]
[New LWP 31696]
[New LWP 31697]
[New LWP 31698]
[New LWP 31699]
[New LWP 31700]
[New LWP 31701]
[New LWP 31702]
[New LWP 31703]
[New LWP 31704]
[New LWP 31705]
[New LWP 31706]
[New LWP 31707]
[New LWP 31708]
[New LWP 31709]
[New LWP 31710]
[New LWP 31711]
[New LWP 31712]
[New LWP 31713]
[New LWP 31714]
[New LWP 31715]
[New LWP 31716]
[New LWP 31717]
[New LWP 31718]
[New LWP 31719]
[New LWP 31720]
[New LWP 31721]
[New LWP 31722]
[New LWP 31723]
[New LWP 31724]
[New LWP 31725]
[New LWP 31726]
[New LWP 31727]
[New LWP 31728]
[New LWP 31729]
[New LWP 31730]
[New LWP 31731]
[New LWP 31732]
[New LWP 31733]
[New LWP 31734]
[New LWP 31735]
[New LWP 31736]
[New LWP 31737]
[New LWP 31738]
[New LWP 31739]
[New LWP 31740]
[New LWP 31741]
[New LWP 31742]
Cannot access memory at address 0x18
Cannot access memory at address 0x10
Cannot access memory at address 0x18
Cannot access memory at address 0x18
Cannot access memory at address 0x10
0x00007fabfa56ed50 in ?? ()
Id Target Id Frame
* 1 LWP 31612 "kudu" 0x00007fabfa56ed50 in ?? ()
2 LWP 31613 "kudu" 0x00007fabf59337a0 in ?? ()
3 LWP 31614 "kudu" 0x00007fabfa56afb9 in ?? ()
4 LWP 31615 "kudu" 0x00007fabfa56afb9 in ?? ()
5 LWP 31616 "kudu" 0x00007fabfa56afb9 in ?? ()
6 LWP 31617 "kernel-watcher-" 0x00007fabfa56afb9 in ?? ()
7 LWP 31624 "ntp client-3162" 0x00007fabfa56e9e2 in ?? ()
8 LWP 31625 "file cache-evic" 0x00007fabfa56afb9 in ?? ()
9 LWP 31626 "sq_acceptor" 0x00007fabf5963cb9 in ?? ()
10 LWP 31629 "rpc reactor-316" 0x00007fabf5970a47 in ?? ()
11 LWP 31630 "rpc reactor-316" 0x00007fabf5970a47 in ?? ()
12 LWP 31631 "rpc reactor-316" 0x00007fabf5970a47 in ?? ()
13 LWP 31632 "rpc reactor-316" 0x00007fabf5970a47 in ?? ()
14 LWP 31633 "MaintenanceMgr " 0x00007fabfa56aad3 in ?? ()
15 LWP 31634 "txn-status-mana" 0x00007fabfa56afb9 in ?? ()
16 LWP 31635 "collect_and_rem" 0x00007fabfa56afb9 in ?? ()
17 LWP 31636 "tc-session-exp-" 0x00007fabfa56afb9 in ?? ()
18 LWP 31637 "rpc worker-3163" 0x00007fabfa56aad3 in ?? ()
19 LWP 31638 "rpc worker-3163" 0x00007fabfa56aad3 in ?? ()
20 LWP 31639 "rpc worker-3163" 0x00007fabfa56aad3 in ?? ()
21 LWP 31640 "rpc worker-3164" 0x00007fabfa56aad3 in ?? ()
22 LWP 31641 "rpc worker-3164" 0x00007fabfa56aad3 in ?? ()
23 LWP 31642 "rpc worker-3164" 0x00007fabfa56aad3 in ?? ()
24 LWP 31643 "rpc worker-3164" 0x00007fabfa56aad3 in ?? ()
25 LWP 31644 "rpc worker-3164" 0x00007fabfa56aad3 in ?? ()
26 LWP 31645 "rpc worker-3164" 0x00007fabfa56aad3 in ?? ()
27 LWP 31646 "rpc worker-3164" 0x00007fabfa56aad3 in ?? ()
28 LWP 31647 "rpc worker-3164" 0x00007fabfa56aad3 in ?? ()
29 LWP 31648 "rpc worker-3164" 0x00007fabfa56aad3 in ?? ()
30 LWP 31649 "rpc worker-3164" 0x00007fabfa56aad3 in ?? ()
31 LWP 31650 "rpc worker-3165" 0x00007fabfa56aad3 in ?? ()
32 LWP 31651 "rpc worker-3165" 0x00007fabfa56aad3 in ?? ()
33 LWP 31652 "rpc worker-3165" 0x00007fabfa56aad3 in ?? ()
34 LWP 31653 "rpc worker-3165" 0x00007fabfa56aad3 in ?? ()
35 LWP 31654 "rpc worker-3165" 0x00007fabfa56aad3 in ?? ()
36 LWP 31655 "rpc worker-3165" 0x00007fabfa56aad3 in ?? ()
37 LWP 31656 "rpc worker-3165" 0x00007fabfa56aad3 in ?? ()
38 LWP 31657 "rpc worker-3165" 0x00007fabfa56aad3 in ?? ()
39 LWP 31658 "rpc worker-3165" 0x00007fabfa56aad3 in ?? ()
40 LWP 31659 "rpc worker-3165" 0x00007fabfa56aad3 in ?? ()
41 LWP 31660 "rpc worker-3166" 0x00007fabfa56aad3 in ?? ()
42 LWP 31661 "rpc worker-3166" 0x00007fabfa56aad3 in ?? ()
43 LWP 31662 "rpc worker-3166" 0x00007fabfa56aad3 in ?? ()
44 LWP 31663 "rpc worker-3166" 0x00007fabfa56aad3 in ?? ()
45 LWP 31664 "rpc worker-3166" 0x00007fabfa56aad3 in ?? ()
46 LWP 31665 "rpc worker-3166" 0x00007fabfa56aad3 in ?? ()
47 LWP 31666 "rpc worker-3166" 0x00007fabfa56aad3 in ?? ()
48 LWP 31667 "rpc worker-3166" 0x00007fabfa56aad3 in ?? ()
49 LWP 31668 "rpc worker-3166" 0x00007fabfa56aad3 in ?? ()
50 LWP 31669 "rpc worker-3166" 0x00007fabfa56aad3 in ?? ()
51 LWP 31670 "rpc worker-3167" 0x00007fabfa56aad3 in ?? ()
52 LWP 31671 "rpc worker-3167" 0x00007fabfa56aad3 in ?? ()
53 LWP 31672 "rpc worker-3167" 0x00007fabfa56aad3 in ?? ()
54 LWP 31673 "rpc worker-3167" 0x00007fabfa56aad3 in ?? ()
55 LWP 31674 "rpc worker-3167" 0x00007fabfa56aad3 in ?? ()
56 LWP 31675 "rpc worker-3167" 0x00007fabfa56aad3 in ?? ()
57 LWP 31676 "rpc worker-3167" 0x00007fabfa56aad3 in ?? ()
58 LWP 31677 "rpc worker-3167" 0x00007fabfa56aad3 in ?? ()
59 LWP 31678 "rpc worker-3167" 0x00007fabfa56aad3 in ?? ()
60 LWP 31679 "rpc worker-3167" 0x00007fabfa56aad3 in ?? ()
61 LWP 31680 "rpc worker-3168" 0x00007fabfa56aad3 in ?? ()
62 LWP 31681 "rpc worker-3168" 0x00007fabfa56aad3 in ?? ()
63 LWP 31682 "rpc worker-3168" 0x00007fabfa56aad3 in ?? ()
64 LWP 31683 "rpc worker-3168" 0x00007fabfa56aad3 in ?? ()
65 LWP 31684 "rpc worker-3168" 0x00007fabfa56aad3 in ?? ()
66 LWP 31685 "rpc worker-3168" 0x00007fabfa56aad3 in ?? ()
67 LWP 31686 "rpc worker-3168" 0x00007fabfa56aad3 in ?? ()
68 LWP 31687 "rpc worker-3168" 0x00007fabfa56aad3 in ?? ()
69 LWP 31688 "rpc worker-3168" 0x00007fabfa56aad3 in ?? ()
70 LWP 31689 "rpc worker-3168" 0x00007fabfa56aad3 in ?? ()
71 LWP 31690 "rpc worker-3169" 0x00007fabfa56aad3 in ?? ()
72 LWP 31691 "rpc worker-3169" 0x00007fabfa56aad3 in ?? ()
73 LWP 31692 "rpc worker-3169" 0x00007fabfa56aad3 in ?? ()
74 LWP 31693 "rpc worker-3169" 0x00007fabfa56aad3 in ?? ()
75 LWP 31694 "rpc worker-3169" 0x00007fabfa56aad3 in ?? ()
76 LWP 31695 "rpc worker-3169" 0x00007fabfa56aad3 in ?? ()
77 LWP 31696 "rpc worker-3169" 0x00007fabfa56aad3 in ?? ()
78 LWP 31697 "rpc worker-3169" 0x00007fabfa56aad3 in ?? ()
79 LWP 31698 "rpc worker-3169" 0x00007fabfa56aad3 in ?? ()
80 LWP 31699 "rpc worker-3169" 0x00007fabfa56aad3 in ?? ()
81 LWP 31700 "rpc worker-3170" 0x00007fabfa56aad3 in ?? ()
82 LWP 31701 "rpc worker-3170" 0x00007fabfa56aad3 in ?? ()
83 LWP 31702 "rpc worker-3170" 0x00007fabfa56aad3 in ?? ()
84 LWP 31703 "rpc worker-3170" 0x00007fabfa56aad3 in ?? ()
85 LWP 31704 "rpc worker-3170" 0x00007fabfa56aad3 in ?? ()
86 LWP 31705 "rpc worker-3170" 0x00007fabfa56aad3 in ?? ()
87 LWP 31706 "rpc worker-3170" 0x00007fabfa56aad3 in ?? ()
88 LWP 31707 "rpc worker-3170" 0x00007fabfa56aad3 in ?? ()
89 LWP 31708 "rpc worker-3170" 0x00007fabfa56aad3 in ?? ()
90 LWP 31709 "rpc worker-3170" 0x00007fabfa56aad3 in ?? ()
91 LWP 31710 "rpc worker-3171" 0x00007fabfa56aad3 in ?? ()
92 LWP 31711 "rpc worker-3171" 0x00007fabfa56aad3 in ?? ()
93 LWP 31712 "rpc worker-3171" 0x00007fabfa56aad3 in ?? ()
94 LWP 31713 "rpc worker-3171" 0x00007fabfa56aad3 in ?? ()
95 LWP 31714 "rpc worker-3171" 0x00007fabfa56aad3 in ?? ()
96 LWP 31715 "rpc worker-3171" 0x00007fabfa56aad3 in ?? ()
97 LWP 31716 "rpc worker-3171" 0x00007fabfa56aad3 in ?? ()
98 LWP 31717 "rpc worker-3171" 0x00007fabfa56aad3 in ?? ()
99 LWP 31718 "rpc worker-3171" 0x00007fabfa56aad3 in ?? ()
100 LWP 31719 "rpc worker-3171" 0x00007fabfa56aad3 in ?? ()
101 LWP 31720 "rpc worker-3172" 0x00007fabfa56aad3 in ?? ()
102 LWP 31721 "rpc worker-3172" 0x00007fabfa56aad3 in ?? ()
103 LWP 31722 "rpc worker-3172" 0x00007fabfa56aad3 in ?? ()
104 LWP 31723 "rpc worker-3172" 0x00007fabfa56aad3 in ?? ()
105 LWP 31724 "rpc worker-3172" 0x00007fabfa56aad3 in ?? ()
106 LWP 31725 "rpc worker-3172" 0x00007fabfa56aad3 in ?? ()
107 LWP 31726 "rpc worker-3172" 0x00007fabfa56aad3 in ?? ()
108 LWP 31727 "rpc worker-3172" 0x00007fabfa56aad3 in ?? ()
109 LWP 31728 "rpc worker-3172" 0x00007fabfa56aad3 in ?? ()
110 LWP 31729 "rpc worker-3172" 0x00007fabfa56aad3 in ?? ()
111 LWP 31730 "rpc worker-3173" 0x00007fabfa56aad3 in ?? ()
112 LWP 31731 "rpc worker-3173" 0x00007fabfa56aad3 in ?? ()
113 LWP 31732 "rpc worker-3173" 0x00007fabfa56aad3 in ?? ()
114 LWP 31733 "rpc worker-3173" 0x00007fabfa56aad3 in ?? ()
115 LWP 31734 "rpc worker-3173" 0x00007fabfa56aad3 in ?? ()
116 LWP 31735 "rpc worker-3173" 0x00007fabfa56aad3 in ?? ()
117 LWP 31736 "rpc worker-3173" 0x00007fabfa56aad3 in ?? ()
118 LWP 31737 "diag-logger-317" 0x00007fabfa56afb9 in ?? ()
119 LWP 31738 "result-tracker-" 0x00007fabfa56afb9 in ?? ()
120 LWP 31739 "excess-log-dele" 0x00007fabfa56afb9 in ?? ()
121 LWP 31740 "acceptor-31740" 0x00007fabf59720c7 in ?? ()
122 LWP 31741 "heartbeat-31741" 0x00007fabfa56afb9 in ?? ()
123 LWP 31742 "maintenance_sch" 0x00007fabfa56afb9 in ?? ()
Thread 123 (LWP 31742):
#0 0x00007fabfa56afb9 in ?? ()
#1 0x00007b0100000000 in ?? ()
#2 0x00000000000000fd in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007b54000028f0 in ?? ()
#5 0x00007fabae9b96c0 in ?? ()
#6 0x00000000000001fa in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 122 (LWP 31741):
#0 0x00007fabfa56afb9 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 121 (LWP 31740):
#0 0x00007fabf59720c7 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 120 (LWP 31739):
#0 0x00007fabfa56afb9 in ?? ()
#1 0x00007fabb01bc940 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007ffe59ca6380 in ?? ()
#5 0x00007fabb01bc7b0 in ?? ()
#6 0x0000000000000002 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 119 (LWP 31738):
#0 0x00007fabfa56afb9 in ?? ()
#1 0x0000000085352fb8 in ?? ()
#2 0x000000000000003f in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007b3400001008 in ?? ()
#5 0x00007fabb09bd800 in ?? ()
#6 0x000000000000007e in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 118 (LWP 31737):
#0 0x00007fabfa56afb9 in ?? ()
#1 0x00007fabf3888008 in ?? ()
#2 0x000000000000003c in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007b4000000c90 in ?? ()
#5 0x00007fabb11be750 in ?? ()
#6 0x0000000000000078 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 117 (LWP 31736):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 116 (LWP 31735):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 115 (LWP 31734):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 114 (LWP 31733):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 113 (LWP 31732):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 112 (LWP 31731):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 111 (LWP 31730):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 110 (LWP 31729):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 109 (LWP 31728):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 108 (LWP 31727):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 107 (LWP 31726):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 106 (LWP 31725):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 105 (LWP 31724):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 104 (LWP 31723):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 103 (LWP 31722):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 102 (LWP 31721):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 101 (LWP 31720):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 100 (LWP 31719):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 99 (LWP 31718):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 98 (LWP 31717):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 97 (LWP 31716):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 96 (LWP 31715):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 95 (LWP 31714):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 94 (LWP 31713):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 93 (LWP 31712):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 92 (LWP 31711):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 91 (LWP 31710):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 90 (LWP 31709):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 89 (LWP 31708):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 88 (LWP 31707):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 87 (LWP 31706):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 86 (LWP 31705):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 85 (LWP 31704):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 84 (LWP 31703):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 83 (LWP 31702):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 82 (LWP 31701):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 81 (LWP 31700):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 80 (LWP 31699):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 79 (LWP 31698):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 78 (LWP 31697):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 77 (LWP 31696):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000a2c in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x00007b24001147c8 in ?? ()
#4 0x00007fabc63ba710 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fabc63ba730 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 76 (LWP 31695):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x000000000000080e in ?? ()
#2 0x0000000000000081 in ?? ()
#3 0x00007b240010ffc8 in ?? ()
#4 0x00007fabc6bbb710 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fabc6bbb730 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 75 (LWP 31694):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 74 (LWP 31693):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 73 (LWP 31692):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 72 (LWP 31691):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 71 (LWP 31690):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 70 (LWP 31689):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 69 (LWP 31688):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 68 (LWP 31687):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 67 (LWP 31686):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 66 (LWP 31685):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 65 (LWP 31684):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 64 (LWP 31683):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 63 (LWP 31682):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 62 (LWP 31681):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 61 (LWP 31680):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 60 (LWP 31679):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 59 (LWP 31678):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 58 (LWP 31677):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 57 (LWP 31676):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x00007b24000b902c in ?? ()
#4 0x00007fabd07bc710 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fabd07bc730 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x007f0400000026c8 in ?? ()
#9 0x00007fabfa56a770 in ?? ()
#10 0x00007fabd07bc730 in ?? ()
#11 0x0002008300000dfe in ?? ()
#12 0x0000000000000000 in ?? ()
Thread 56 (LWP 31675):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 55 (LWP 31674):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 54 (LWP 31673):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 53 (LWP 31672):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 52 (LWP 31671):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 51 (LWP 31670):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 50 (LWP 31669):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 49 (LWP 31668):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 48 (LWP 31667):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 47 (LWP 31666):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 46 (LWP 31665):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 45 (LWP 31664):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 44 (LWP 31663):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 43 (LWP 31662):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 42 (LWP 31661):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 41 (LWP 31660):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 40 (LWP 31659):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 39 (LWP 31658):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 38 (LWP 31657):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 37 (LWP 31656):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000007 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x00007b240005ffec in ?? ()
#4 0x00007fabdabbe710 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fabdabbe730 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x0000000000000000 in ?? ()
Thread 36 (LWP 31655):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000001 in ?? ()
#2 0x0000000100000081 in ?? ()
#3 0x00007b240005d7fc in ?? ()
#4 0x00007fabdb5b6710 in ?? ()
#5 0x0000008000000000 in ?? ()
#6 0x00007fabdb5b6730 in ?? ()
#7 0x0000000000000001 in ?? ()
#8 0x0000000000000000 in ?? ()
Thread 35 (LWP 31654):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 34 (LWP 31653):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 33 (LWP 31652):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 32 (LWP 31651):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 31 (LWP 31650):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 30 (LWP 31649):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 29 (LWP 31648):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 28 (LWP 31647):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 27 (LWP 31646):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 26 (LWP 31645):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 25 (LWP 31644):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 24 (LWP 31643):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 23 (LWP 31642):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 22 (LWP 31641):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 21 (LWP 31640):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 20 (LWP 31639):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 19 (LWP 31638):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 18 (LWP 31637):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 17 (LWP 31636):
#0 0x00007fabfa56afb9 in ?? ()
#1 0x0000000017a335f0 in ?? ()
#2 0x0000000000000006 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007b4800003a00 in ?? ()
#5 0x00007fabe518e700 in ?? ()
#6 0x000000000000000c in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 16 (LWP 31635):
#0 0x00007fabfa56afb9 in ?? ()
#1 0x00007fabe598f9a8 in ?? ()
#2 0x000000000000000c in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007b44000372d8 in ?? ()
#5 0x00007fabe598f840 in ?? ()
#6 0x0000000000000018 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 15 (LWP 31634):
#0 0x00007fabfa56afb9 in ?? ()
#1 0x0000000000000018 in ?? ()
#2 0x0000000000000006 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007b5800000118 in ?? ()
#5 0x00007fabe6190410 in ?? ()
#6 0x000000000000000c in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 14 (LWP 31633):
#0 0x00007fabfa56aad3 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 13 (LWP 31632):
#0 0x00007fabf5970a47 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 12 (LWP 31631):
#0 0x00007fabf5970a47 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 11 (LWP 31630):
#0 0x00007fabf5970a47 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 10 (LWP 31629):
#0 0x00007fabf5970a47 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 9 (LWP 31626):
#0 0x00007fabf5963cb9 in ?? ()
#1 0x00007fabee7bcc10 in ?? ()
#2 0x00007b0400009010 in ?? ()
#3 0x00007fabee7bdb80 in ?? ()
#4 0x00007fabee7bcc10 in ?? ()
#5 0x00007b0400009010 in ?? ()
#6 0x0000000000488763 in __sanitizer::internal_alloc_placeholder ()
#7 0x00007fabf32cc000 in ?? ()
#8 0x0100000000000001 in ?? ()
#9 0x00007fabee7bdb80 in ?? ()
#10 0x00007fabff3468b8 in ?? ()
#11 0x0000000000000000 in ?? ()
Thread 8 (LWP 31625):
#0 0x00007fabfa56afb9 in ?? ()
#1 0x0000600000000000 in ?? ()
#2 0x0000000000000001 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007b4400034018 in ?? ()
#5 0x00007fabedfbb7f0 in ?? ()
#6 0x0000000000000002 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 7 (LWP 31624):
#0 0x00007fabfa56e9e2 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 6 (LWP 31617):
#0 0x00007fabfa56afb9 in ?? ()
#1 0x00007fabef7bea40 in ?? ()
#2 0x0000000000000142 in ?? ()
#3 0x0000000000000081 in ?? ()
#4 0x00007b4400035b98 in ?? ()
#5 0x00007fabef7be5d0 in ?? ()
#6 0x0000000000000284 in ?? ()
#7 0x0000000000000000 in ?? ()
Thread 5 (LWP 31616):
#0 0x00007fabfa56afb9 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 4 (LWP 31615):
#0 0x00007fabfa56afb9 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 3 (LWP 31614):
#0 0x00007fabfa56afb9 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 2 (LWP 31613):
#0 0x00007fabf59337a0 in ?? ()
#1 0x0000000000000000 in ?? ()
Thread 1 (LWP 31612):
#0 0x00007fabfa56ed50 in ?? ()
#1 0x0000600001000078 in ?? ()
#2 0x00000000004679eb in __sanitizer::internal_alloc_placeholder ()
#3 0x00007fabf4b91cc0 in ?? ()
#4 0x00007fabf4b91cc0 in ?? ()
#5 0x00007ffe59ca6190 in ?? ()
#6 0x000000000048adb4 in __sanitizer::internal_alloc_placeholder ()
#7 0x0000600001000078 in ?? ()
#8 0x0000e00000a97df3 in ?? ()
#9 0x00007fabf4b91cc0 in ?? ()
#10 0x00007fabf8a97ebb in ?? ()
#11 0x0000000000000000 in ?? ()
************************* END STACKS ***************************
I20250624 14:19:29.310609 30585 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskTXj651/build/tsan/bin/kudu with pid 31346
I20250624 14:19:29.363170 30585 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskTXj651/build/tsan/bin/kudu with pid 31479
I20250624 14:19:29.411909 30585 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskTXj651/build/tsan/bin/kudu with pid 31612
I20250624 14:19:29.462123 30585 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskTXj651/build/tsan/bin/kudu with pid 31254
2025-06-24T14:19:29Z chronyd exiting
I20250624 14:19:29.522231 30585 test_util.cc:183] -----------------------------------------------
I20250624 14:19:29.522457 30585 test_util.cc:184] Had failures, leaving test files at /tmp/dist-test-taskTXj651/test-tmp/tablet_copy-itest.0.TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate.1750774676460238-30585-0
[ FAILED ] TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate (71458 ms)
[----------] 4 tests from TabletCopyITest (92901 ms total)
[----------] 1 test from FaultFlags/BadTabletCopyITest
[ RUN ] FaultFlags/BadTabletCopyITest.TestBadCopy/1
/home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/integration-tests/tablet_copy-itest.cc:1510: Skipped
test is skipped; set KUDU_ALLOW_SLOW_TESTS=1 to run
[ SKIPPED ] FaultFlags/BadTabletCopyITest.TestBadCopy/1 (7 ms)
[----------] 1 test from FaultFlags/BadTabletCopyITest (7 ms total)
[----------] Global test environment tear-down
[==========] 5 tests from 2 test suites ran. (92910 ms total)
[ PASSED ] 1 test.
[ SKIPPED ] 3 tests, listed below:
[ SKIPPED ] TabletCopyITest.TestRejectRogueLeader
[ SKIPPED ] TabletCopyITest.TestDeleteLeaderDuringTabletCopyStressTest
[ SKIPPED ] FaultFlags/BadTabletCopyITest.TestBadCopy/1
[ FAILED ] 1 test, listed below:
[ FAILED ] TabletCopyITest.TestDownloadWalInParallelWithHeavyUpdate
1 FAILED TEST