Note: This is test shard 6 of 8.
[==========] Running 9 tests from 5 test suites.
[----------] Global test environment set-up.
[----------] 5 tests from AdminCliTest
[ RUN      ] AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes
WARNING: Logging before InitGoogleLogging() is written to STDERR
I20250624 02:13:42.530121 31915 test_util.cc:276] Using random seed: -526534776
W20250624 02:13:43.772722 31915 thread.cc:640] rpc reactor (reactor) Time spent creating pthread: real 1.195s user 0.429s sys 0.763s
W20250624 02:13:43.773183 31915 thread.cc:606] rpc reactor (reactor) Time spent starting thread: real 1.195s user 0.429s sys 0.763s
I20250624 02:13:43.775965 31915 ts_itest-base.cc:115] Starting cluster with:
I20250624 02:13:43.776247 31915 ts_itest-base.cc:116] --------------
I20250624 02:13:43.776454 31915 ts_itest-base.cc:117] 4 tablet servers
I20250624 02:13:43.776674 31915 ts_itest-base.cc:118] 3 replicas per TS
I20250624 02:13:43.776885 31915 ts_itest-base.cc:119] --------------
2025-06-24T02:13:43Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2025-06-24T02:13:43Z Disabled control of system clock
I20250624 02:13:43.820158 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.31.42.254:36503
--webserver_interface=127.31.42.254
--webserver_port=0
--builtin_ntp_servers=127.31.42.212:44861
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.31.42.254:36503 with env {}
W20250624 02:13:44.146412 31929 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:13:44.147033 31929 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:13:44.147495 31929 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:13:44.178879 31929 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20250624 02:13:44.179201 31929 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:13:44.179466 31929 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20250624 02:13:44.179710 31929 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20250624 02:13:44.215250 31929 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:44861
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.31.42.254:36503
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.31.42.254:36503
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb
--webserver_interface=127.31.42.254
--webserver_port=0
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/logs
--logbuflevel=-1
--logtostderr=true
Master server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:13:44.216576 31929 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:13:44.218276 31929 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:13:44.235591 31936 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:13:44.237421 31935 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:13:44.237542 31929 server_base.cc:1048] running on GCE node
W20250624 02:13:44.238377 31938 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:13:45.446491 31929 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:13:45.449738 31929 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:13:45.451159 31929 hybrid_clock.cc:648] HybridClock initialized: now 1750731225451125 us; error 75 us; skew 500 ppm
I20250624 02:13:45.452008 31929 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:13:45.459689 31929 webserver.cc:469] Webserver started at http://127.31.42.254:43685/ using document root <none> and password file <none>
I20250624 02:13:45.460777 31929 fs_manager.cc:362] Metadata directory not provided
I20250624 02:13:45.460994 31929 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:13:45.461467 31929 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:13:45.466097 31929 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/instance:
uuid: "524d9ddd4367420893a4c22505825b22"
format_stamp: "Formatted at 2025-06-24 02:13:45 on dist-test-slave-5k9r"
I20250624 02:13:45.467281 31929 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal/instance:
uuid: "524d9ddd4367420893a4c22505825b22"
format_stamp: "Formatted at 2025-06-24 02:13:45 on dist-test-slave-5k9r"
I20250624 02:13:45.475113 31929 fs_manager.cc:696] Time spent creating directory manager: real 0.007s user 0.002s sys 0.005s
I20250624 02:13:45.481528 31945 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:13:45.482798 31929 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.002s sys 0.001s
I20250624 02:13:45.483168 31929 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
uuid: "524d9ddd4367420893a4c22505825b22"
format_stamp: "Formatted at 2025-06-24 02:13:45 on dist-test-slave-5k9r"
I20250624 02:13:45.483505 31929 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:13:45.537281 31929 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:13:45.538833 31929 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:13:45.539291 31929 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:13:45.627481 31929 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.254:36503
I20250624 02:13:45.627588 31996 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.254:36503 every 8 connection(s)
I20250624 02:13:45.630307 31929 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb
I20250624 02:13:45.635470 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 31929
I20250624 02:13:45.636025 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal/instance
I20250624 02:13:45.637321 31997 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:13:45.659407 31997 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22: Bootstrap starting.
I20250624 02:13:45.666121 31997 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22: Neither blocks nor log segments found. Creating new log.
I20250624 02:13:45.668551 31997 log.cc:826] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22: Log is configured to *not* fsync() on all Append() calls
I20250624 02:13:45.675184 31997 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22: No bootstrap required, opened a new log
I20250624 02:13:45.696094 31997 raft_consensus.cc:357] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "524d9ddd4367420893a4c22505825b22" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 36503 } }
I20250624 02:13:45.696763 31997 raft_consensus.cc:383] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:13:45.697014 31997 raft_consensus.cc:738] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 524d9ddd4367420893a4c22505825b22, State: Initialized, Role: FOLLOWER
I20250624 02:13:45.697705 31997 consensus_queue.cc:260] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "524d9ddd4367420893a4c22505825b22" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 36503 } }
I20250624 02:13:45.698248 31997 raft_consensus.cc:397] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 02:13:45.698523 31997 raft_consensus.cc:491] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 02:13:45.698835 31997 raft_consensus.cc:3058] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:13:45.703372 31997 raft_consensus.cc:513] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "524d9ddd4367420893a4c22505825b22" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 36503 } }
I20250624 02:13:45.704125 31997 leader_election.cc:304] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 524d9ddd4367420893a4c22505825b22; no voters:
I20250624 02:13:45.706319 31997 leader_election.cc:290] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [CANDIDATE]: Term 1 election: Requested vote from peers
I20250624 02:13:45.707075 32002 raft_consensus.cc:2802] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [term 1 FOLLOWER]: Leader election won for term 1
I20250624 02:13:45.710052 32002 raft_consensus.cc:695] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [term 1 LEADER]: Becoming Leader. State: Replica: 524d9ddd4367420893a4c22505825b22, State: Running, Role: LEADER
I20250624 02:13:45.711373 32002 consensus_queue.cc:237] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "524d9ddd4367420893a4c22505825b22" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 36503 } }
I20250624 02:13:45.713568 31997 sys_catalog.cc:564] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [sys.catalog]: configured and running, proceeding with master startup.
I20250624 02:13:45.725414 32003 sys_catalog.cc:455] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 524d9ddd4367420893a4c22505825b22. Latest consensus state: current_term: 1 leader_uuid: "524d9ddd4367420893a4c22505825b22" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "524d9ddd4367420893a4c22505825b22" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 36503 } } }
I20250624 02:13:45.726938 32003 sys_catalog.cc:458] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [sys.catalog]: This master's current role is: LEADER
I20250624 02:13:45.726862 32004 sys_catalog.cc:455] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "524d9ddd4367420893a4c22505825b22" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "524d9ddd4367420893a4c22505825b22" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 36503 } } }
I20250624 02:13:45.727587 32004 sys_catalog.cc:458] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [sys.catalog]: This master's current role is: LEADER
I20250624 02:13:45.731190 32011 catalog_manager.cc:1477] Loading table and tablet metadata into memory...
I20250624 02:13:45.745914 32011 catalog_manager.cc:1486] Initializing Kudu cluster ID...
I20250624 02:13:45.770906 32011 catalog_manager.cc:1349] Generated new cluster ID: a84600010588424e828f8667c956016b
I20250624 02:13:45.771395 32011 catalog_manager.cc:1497] Initializing Kudu internal certificate authority...
I20250624 02:13:45.805819 32011 catalog_manager.cc:1372] Generated new certificate authority record
I20250624 02:13:45.807432 32011 catalog_manager.cc:1506] Loading token signing keys...
I20250624 02:13:45.829546 32011 catalog_manager.cc:5955] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22: Generated new TSK 0
I20250624 02:13:45.830731 32011 catalog_manager.cc:1516] Initializing in-progress tserver states...
I20250624 02:13:45.850214 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.193:0
--local_ip_for_outbound_sockets=127.31.42.193
--webserver_interface=127.31.42.193
--webserver_port=0
--tserver_master_addrs=127.31.42.254:36503
--builtin_ntp_servers=127.31.42.212:44861
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--consensus_rpc_timeout_ms=30000 with env {}
W20250624 02:13:46.187503 32021 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:13:46.188035 32021 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:13:46.188535 32021 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:13:46.220916 32021 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:13:46.221796 32021 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.193
I20250624 02:13:46.258503 32021 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:44861
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.193:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/info.pb
--webserver_interface=127.31.42.193
--webserver_port=0
--tserver_master_addrs=127.31.42.254:36503
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.193
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:13:46.259850 32021 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:13:46.261584 32021 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:13:46.282804 32028 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:13:46.285782 32027 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:13:46.287071 32030 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:13:46.288175 32021 server_base.cc:1048] running on GCE node
I20250624 02:13:47.477290 32021 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:13:47.480823 32021 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:13:47.482299 32021 hybrid_clock.cc:648] HybridClock initialized: now 1750731227482232 us; error 84 us; skew 500 ppm
I20250624 02:13:47.483170 32021 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:13:47.490558 32021 webserver.cc:469] Webserver started at http://127.31.42.193:40425/ using document root <none> and password file <none>
I20250624 02:13:47.491508 32021 fs_manager.cc:362] Metadata directory not provided
I20250624 02:13:47.491724 32021 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:13:47.492192 32021 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:13:47.496711 32021 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/instance:
uuid: "11c448593b9b4152a40bf947653f01e2"
format_stamp: "Formatted at 2025-06-24 02:13:47 on dist-test-slave-5k9r"
I20250624 02:13:47.497869 32021 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal/instance:
uuid: "11c448593b9b4152a40bf947653f01e2"
format_stamp: "Formatted at 2025-06-24 02:13:47 on dist-test-slave-5k9r"
I20250624 02:13:47.505715 32021 fs_manager.cc:696] Time spent creating directory manager: real 0.007s user 0.003s sys 0.004s
I20250624 02:13:47.512900 32037 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:13:47.514107 32021 fs_manager.cc:730] Time spent opening block manager: real 0.005s user 0.007s sys 0.000s
I20250624 02:13:47.514492 32021 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
uuid: "11c448593b9b4152a40bf947653f01e2"
format_stamp: "Formatted at 2025-06-24 02:13:47 on dist-test-slave-5k9r"
I20250624 02:13:47.514845 32021 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:13:47.578926 32021 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:13:47.580526 32021 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:13:47.581002 32021 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:13:47.584249 32021 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:13:47.588526 32021 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 02:13:47.588765 32021 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250624 02:13:47.589025 32021 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 02:13:47.589188 32021 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250624 02:13:47.763106 32021 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.193:42355
I20250624 02:13:47.763164 32149 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.193:42355 every 8 connection(s)
I20250624 02:13:47.765801 32021 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/info.pb
I20250624 02:13:47.768585 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 32021
I20250624 02:13:47.769155 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal/instance
I20250624 02:13:47.776567 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.194:0
--local_ip_for_outbound_sockets=127.31.42.194
--webserver_interface=127.31.42.194
--webserver_port=0
--tserver_master_addrs=127.31.42.254:36503
--builtin_ntp_servers=127.31.42.212:44861
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--consensus_rpc_timeout_ms=30000 with env {}
I20250624 02:13:47.791749 32150 heartbeater.cc:344] Connected to a master server at 127.31.42.254:36503
I20250624 02:13:47.792311 32150 heartbeater.cc:461] Registering TS with master...
I20250624 02:13:47.793358 32150 heartbeater.cc:507] Master 127.31.42.254:36503 requested a full tablet report, sending...
I20250624 02:13:47.796115 31962 ts_manager.cc:194] Registered new tserver with Master: 11c448593b9b4152a40bf947653f01e2 (127.31.42.193:42355)
I20250624 02:13:47.798274 31962 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.193:58575
W20250624 02:13:48.107744 32154 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:13:48.108273 32154 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:13:48.108772 32154 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:13:48.140678 32154 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:13:48.141511 32154 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.194
I20250624 02:13:48.176942 32154 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:44861
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.194:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/info.pb
--webserver_interface=127.31.42.194
--webserver_port=0
--tserver_master_addrs=127.31.42.254:36503
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.194
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:13:48.178305 32154 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:13:48.179987 32154 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:13:48.197445 32161 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:13:48.802769 32150 heartbeater.cc:499] Master 127.31.42.254:36503 was elected leader, sending a full tablet report...
W20250624 02:13:48.198429 32160 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:13:49.415477 32163 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:13:49.418138 32162 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Resolving timed out after 1217 milliseconds
W20250624 02:13:49.419750 32154 thread.cc:640] OpenStack (cloud detector) Time spent creating pthread: real 1.225s user 0.406s sys 0.809s
W20250624 02:13:49.420017 32154 thread.cc:606] OpenStack (cloud detector) Time spent starting thread: real 1.226s user 0.406s sys 0.809s
I20250624 02:13:49.420228 32154 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
I20250624 02:13:49.421307 32154 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:13:49.424242 32154 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:13:49.425609 32154 hybrid_clock.cc:648] HybridClock initialized: now 1750731229425576 us; error 55 us; skew 500 ppm
I20250624 02:13:49.426440 32154 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:13:49.435148 32154 webserver.cc:469] Webserver started at http://127.31.42.194:45301/ using document root <none> and password file <none>
I20250624 02:13:49.436726 32154 fs_manager.cc:362] Metadata directory not provided
I20250624 02:13:49.436975 32154 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:13:49.437422 32154 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:13:49.441900 32154 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/instance:
uuid: "61196268dddc47f0966f365f9829389c"
format_stamp: "Formatted at 2025-06-24 02:13:49 on dist-test-slave-5k9r"
I20250624 02:13:49.443132 32154 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal/instance:
uuid: "61196268dddc47f0966f365f9829389c"
format_stamp: "Formatted at 2025-06-24 02:13:49 on dist-test-slave-5k9r"
I20250624 02:13:49.451308 32154 fs_manager.cc:696] Time spent creating directory manager: real 0.008s user 0.005s sys 0.002s
I20250624 02:13:49.457876 32170 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:13:49.459753 32154 fs_manager.cc:730] Time spent opening block manager: real 0.005s user 0.001s sys 0.002s
I20250624 02:13:49.460099 32154 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
uuid: "61196268dddc47f0966f365f9829389c"
format_stamp: "Formatted at 2025-06-24 02:13:49 on dist-test-slave-5k9r"
I20250624 02:13:49.460439 32154 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:13:49.528779 32154 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:13:49.530368 32154 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:13:49.530891 32154 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:13:49.534021 32154 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:13:49.538185 32154 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 02:13:49.538405 32154 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250624 02:13:49.538655 32154 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 02:13:49.538812 32154 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250624 02:13:49.711252 32154 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.194:35929
I20250624 02:13:49.711354 32282 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.194:35929 every 8 connection(s)
I20250624 02:13:49.713842 32154 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/info.pb
I20250624 02:13:49.718947 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 32154
I20250624 02:13:49.719476 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal/instance
I20250624 02:13:49.726291 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.195:0
--local_ip_for_outbound_sockets=127.31.42.195
--webserver_interface=127.31.42.195
--webserver_port=0
--tserver_master_addrs=127.31.42.254:36503
--builtin_ntp_servers=127.31.42.212:44861
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--consensus_rpc_timeout_ms=30000 with env {}
I20250624 02:13:49.737538 32283 heartbeater.cc:344] Connected to a master server at 127.31.42.254:36503
I20250624 02:13:49.738000 32283 heartbeater.cc:461] Registering TS with master...
I20250624 02:13:49.739117 32283 heartbeater.cc:507] Master 127.31.42.254:36503 requested a full tablet report, sending...
I20250624 02:13:49.741906 31962 ts_manager.cc:194] Registered new tserver with Master: 61196268dddc47f0966f365f9829389c (127.31.42.194:35929)
I20250624 02:13:49.743937 31962 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.194:58107
W20250624 02:13:50.052881 32287 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:13:50.053349 32287 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:13:50.053802 32287 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:13:50.085522 32287 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:13:50.086371 32287 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.195
I20250624 02:13:50.122748 32287 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:44861
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.195:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/info.pb
--webserver_interface=127.31.42.195
--webserver_port=0
--tserver_master_addrs=127.31.42.254:36503
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.195
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:13:50.124048 32287 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:13:50.125720 32287 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:13:50.146118 32293 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:13:50.747546 32283 heartbeater.cc:499] Master 127.31.42.254:36503 was elected leader, sending a full tablet report...
W20250624 02:13:50.146622 32294 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:13:50.148010 32296 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:13:51.334892 32295 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Connection time-out
I20250624 02:13:51.334776 32287 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
I20250624 02:13:51.338985 32287 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:13:51.341799 32287 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:13:51.343279 32287 hybrid_clock.cc:648] HybridClock initialized: now 1750731231343253 us; error 46 us; skew 500 ppm
I20250624 02:13:51.344112 32287 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:13:51.355552 32287 webserver.cc:469] Webserver started at http://127.31.42.195:38319/ using document root <none> and password file <none>
I20250624 02:13:51.356534 32287 fs_manager.cc:362] Metadata directory not provided
I20250624 02:13:51.356771 32287 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:13:51.357290 32287 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:13:51.361821 32287 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/instance:
uuid: "3a7ccbc73c7243fd9e789330a7c77baf"
format_stamp: "Formatted at 2025-06-24 02:13:51 on dist-test-slave-5k9r"
I20250624 02:13:51.363037 32287 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal/instance:
uuid: "3a7ccbc73c7243fd9e789330a7c77baf"
format_stamp: "Formatted at 2025-06-24 02:13:51 on dist-test-slave-5k9r"
I20250624 02:13:51.371244 32287 fs_manager.cc:696] Time spent creating directory manager: real 0.008s user 0.002s sys 0.005s
I20250624 02:13:51.377377 32303 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:13:51.378515 32287 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.002s sys 0.001s
I20250624 02:13:51.378847 32287 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
uuid: "3a7ccbc73c7243fd9e789330a7c77baf"
format_stamp: "Formatted at 2025-06-24 02:13:51 on dist-test-slave-5k9r"
I20250624 02:13:51.379168 32287 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:13:51.431929 32287 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:13:51.433462 32287 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:13:51.433969 32287 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:13:51.436956 32287 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:13:51.441224 32287 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 02:13:51.441442 32287 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250624 02:13:51.441663 32287 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 02:13:51.441797 32287 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250624 02:13:51.628681 32287 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.195:46295
I20250624 02:13:51.628899 32415 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.195:46295 every 8 connection(s)
I20250624 02:13:51.632272 32287 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/info.pb
I20250624 02:13:51.635306 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 32287
I20250624 02:13:51.635947 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal/instance
I20250624 02:13:51.648023 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.196:0
--local_ip_for_outbound_sockets=127.31.42.196
--webserver_interface=127.31.42.196
--webserver_port=0
--tserver_master_addrs=127.31.42.254:36503
--builtin_ntp_servers=127.31.42.212:44861
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--consensus_rpc_timeout_ms=30000 with env {}
I20250624 02:13:51.670640 32416 heartbeater.cc:344] Connected to a master server at 127.31.42.254:36503
I20250624 02:13:51.671177 32416 heartbeater.cc:461] Registering TS with master...
I20250624 02:13:51.672462 32416 heartbeater.cc:507] Master 127.31.42.254:36503 requested a full tablet report, sending...
I20250624 02:13:51.674937 31962 ts_manager.cc:194] Registered new tserver with Master: 3a7ccbc73c7243fd9e789330a7c77baf (127.31.42.195:46295)
I20250624 02:13:51.676280 31962 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.195:51305
W20250624 02:13:51.989823 32420 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:13:51.990401 32420 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:13:51.990912 32420 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:13:52.023466 32420 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:13:52.024389 32420 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.196
I20250624 02:13:52.060251 32420 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:44861
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.196:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/data/info.pb
--webserver_interface=127.31.42.196
--webserver_port=0
--tserver_master_addrs=127.31.42.254:36503
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.196
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:13:52.061506 32420 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:13:52.063186 32420 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:13:52.080925 32429 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:13:52.679606 32416 heartbeater.cc:499] Master 127.31.42.254:36503 was elected leader, sending a full tablet report...
W20250624 02:13:52.080928 32427 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:13:52.082258 32426 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:13:53.254581 32428 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Connection time-out
I20250624 02:13:53.254622 32420 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
I20250624 02:13:53.258769 32420 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:13:53.261714 32420 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:13:53.263227 32420 hybrid_clock.cc:648] HybridClock initialized: now 1750731233263193 us; error 79 us; skew 500 ppm
I20250624 02:13:53.264076 32420 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:13:53.270756 32420 webserver.cc:469] Webserver started at http://127.31.42.196:41733/ using document root <none> and password file <none>
I20250624 02:13:53.271735 32420 fs_manager.cc:362] Metadata directory not provided
I20250624 02:13:53.271971 32420 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:13:53.272432 32420 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:13:53.276994 32420 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/data/instance:
uuid: "293aefb746844e108ae0498be425a2ae"
format_stamp: "Formatted at 2025-06-24 02:13:53 on dist-test-slave-5k9r"
I20250624 02:13:53.278352 32420 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/wal/instance:
uuid: "293aefb746844e108ae0498be425a2ae"
format_stamp: "Formatted at 2025-06-24 02:13:53 on dist-test-slave-5k9r"
I20250624 02:13:53.285918 32420 fs_manager.cc:696] Time spent creating directory manager: real 0.007s user 0.002s sys 0.005s
I20250624 02:13:53.291807 32437 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:13:53.292955 32420 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.000s sys 0.005s
I20250624 02:13:53.293282 32420 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/wal
uuid: "293aefb746844e108ae0498be425a2ae"
format_stamp: "Formatted at 2025-06-24 02:13:53 on dist-test-slave-5k9r"
I20250624 02:13:53.293627 32420 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:13:53.387611 32420 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:13:53.389093 32420 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:13:53.389536 32420 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:13:53.392264 32420 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:13:53.396574 32420 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 02:13:53.396771 32420 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250624 02:13:53.397040 32420 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 02:13:53.397189 32420 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250624 02:13:53.558161 32420 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.196:45233
I20250624 02:13:53.558260 32549 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.196:45233 every 8 connection(s)
I20250624 02:13:53.560818 32420 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/data/info.pb
I20250624 02:13:53.568413 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 32420
I20250624 02:13:53.569031 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/wal/instance
I20250624 02:13:53.582805 32550 heartbeater.cc:344] Connected to a master server at 127.31.42.254:36503
I20250624 02:13:53.583251 32550 heartbeater.cc:461] Registering TS with master...
I20250624 02:13:53.584317 32550 heartbeater.cc:507] Master 127.31.42.254:36503 requested a full tablet report, sending...
I20250624 02:13:53.586532 31962 ts_manager.cc:194] Registered new tserver with Master: 293aefb746844e108ae0498be425a2ae (127.31.42.196:45233)
I20250624 02:13:53.588027 31962 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.196:54173
I20250624 02:13:53.590274 31915 external_mini_cluster.cc:934] 4 TS(s) registered with all masters
I20250624 02:13:53.634032 31962 catalog_manager.cc:2232] Servicing CreateTable request from {username='slave'} at 127.0.0.1:39128:
name: "TestTable"
schema {
columns {
name: "key"
type: INT32
is_key: true
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "int_val"
type: INT32
is_key: false
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "string_val"
type: STRING
is_key: false
is_nullable: true
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
range_schema {
columns {
name: "key"
}
}
}
owner: "alice"
I20250624 02:13:53.711107 32085 tablet_service.cc:1468] Processing CreateTablet for tablet 9cb47de9f2b743b4a245a41ae82cadd9 (DEFAULT_TABLE table=TestTable [id=22095834d7a64d8fbe014eb049e2e8bc]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 02:13:53.718107 32085 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 9cb47de9f2b743b4a245a41ae82cadd9. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:13:53.722491 32485 tablet_service.cc:1468] Processing CreateTablet for tablet 9cb47de9f2b743b4a245a41ae82cadd9 (DEFAULT_TABLE table=TestTable [id=22095834d7a64d8fbe014eb049e2e8bc]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 02:13:53.724570 32485 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 9cb47de9f2b743b4a245a41ae82cadd9. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:13:53.723811 32218 tablet_service.cc:1468] Processing CreateTablet for tablet 9cb47de9f2b743b4a245a41ae82cadd9 (DEFAULT_TABLE table=TestTable [id=22095834d7a64d8fbe014eb049e2e8bc]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 02:13:53.726171 32218 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 9cb47de9f2b743b4a245a41ae82cadd9. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:13:53.752656 32569 tablet_bootstrap.cc:492] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2: Bootstrap starting.
I20250624 02:13:53.754700 32570 tablet_bootstrap.cc:492] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c: Bootstrap starting.
I20250624 02:13:53.762187 32569 tablet_bootstrap.cc:654] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2: Neither blocks nor log segments found. Creating new log.
I20250624 02:13:53.762995 32570 tablet_bootstrap.cc:654] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c: Neither blocks nor log segments found. Creating new log.
I20250624 02:13:53.764811 32569 log.cc:826] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2: Log is configured to *not* fsync() on all Append() calls
I20250624 02:13:53.765561 32570 log.cc:826] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c: Log is configured to *not* fsync() on all Append() calls
I20250624 02:13:53.766364 32571 tablet_bootstrap.cc:492] T 9cb47de9f2b743b4a245a41ae82cadd9 P 293aefb746844e108ae0498be425a2ae: Bootstrap starting.
I20250624 02:13:53.774555 32571 tablet_bootstrap.cc:654] T 9cb47de9f2b743b4a245a41ae82cadd9 P 293aefb746844e108ae0498be425a2ae: Neither blocks nor log segments found. Creating new log.
I20250624 02:13:53.780018 32571 log.cc:826] T 9cb47de9f2b743b4a245a41ae82cadd9 P 293aefb746844e108ae0498be425a2ae: Log is configured to *not* fsync() on all Append() calls
I20250624 02:13:53.794204 32569 tablet_bootstrap.cc:492] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2: No bootstrap required, opened a new log
I20250624 02:13:53.794574 32570 tablet_bootstrap.cc:492] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c: No bootstrap required, opened a new log
I20250624 02:13:53.794836 32569 ts_tablet_manager.cc:1397] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2: Time spent bootstrapping tablet: real 0.043s user 0.014s sys 0.027s
I20250624 02:13:53.795203 32570 ts_tablet_manager.cc:1397] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c: Time spent bootstrapping tablet: real 0.041s user 0.015s sys 0.023s
I20250624 02:13:53.795989 32571 tablet_bootstrap.cc:492] T 9cb47de9f2b743b4a245a41ae82cadd9 P 293aefb746844e108ae0498be425a2ae: No bootstrap required, opened a new log
I20250624 02:13:53.796586 32571 ts_tablet_manager.cc:1397] T 9cb47de9f2b743b4a245a41ae82cadd9 P 293aefb746844e108ae0498be425a2ae: Time spent bootstrapping tablet: real 0.031s user 0.007s sys 0.019s
I20250624 02:13:53.819697 32570 raft_consensus.cc:357] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "293aefb746844e108ae0498be425a2ae" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 45233 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } }
I20250624 02:13:53.820427 32570 raft_consensus.cc:383] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:13:53.820708 32570 raft_consensus.cc:738] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 61196268dddc47f0966f365f9829389c, State: Initialized, Role: FOLLOWER
I20250624 02:13:53.821494 32570 consensus_queue.cc:260] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "293aefb746844e108ae0498be425a2ae" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 45233 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } }
I20250624 02:13:53.826179 32570 ts_tablet_manager.cc:1428] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c: Time spent starting tablet: real 0.031s user 0.026s sys 0.004s
I20250624 02:13:53.831141 32569 raft_consensus.cc:357] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "293aefb746844e108ae0498be425a2ae" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 45233 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } }
I20250624 02:13:53.831856 32571 raft_consensus.cc:357] T 9cb47de9f2b743b4a245a41ae82cadd9 P 293aefb746844e108ae0498be425a2ae [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "293aefb746844e108ae0498be425a2ae" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 45233 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } }
I20250624 02:13:53.832784 32569 raft_consensus.cc:383] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:13:53.832746 32571 raft_consensus.cc:383] T 9cb47de9f2b743b4a245a41ae82cadd9 P 293aefb746844e108ae0498be425a2ae [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:13:53.833055 32569 raft_consensus.cc:738] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 11c448593b9b4152a40bf947653f01e2, State: Initialized, Role: FOLLOWER
I20250624 02:13:53.833055 32571 raft_consensus.cc:738] T 9cb47de9f2b743b4a245a41ae82cadd9 P 293aefb746844e108ae0498be425a2ae [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 293aefb746844e108ae0498be425a2ae, State: Initialized, Role: FOLLOWER
I20250624 02:13:53.833858 32569 consensus_queue.cc:260] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "293aefb746844e108ae0498be425a2ae" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 45233 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } }
I20250624 02:13:53.834124 32571 consensus_queue.cc:260] T 9cb47de9f2b743b4a245a41ae82cadd9 P 293aefb746844e108ae0498be425a2ae [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "293aefb746844e108ae0498be425a2ae" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 45233 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } }
I20250624 02:13:53.840252 32550 heartbeater.cc:499] Master 127.31.42.254:36503 was elected leader, sending a full tablet report...
I20250624 02:13:53.840525 32569 ts_tablet_manager.cc:1428] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2: Time spent starting tablet: real 0.045s user 0.027s sys 0.008s
I20250624 02:13:53.841343 32571 ts_tablet_manager.cc:1428] T 9cb47de9f2b743b4a245a41ae82cadd9 P 293aefb746844e108ae0498be425a2ae: Time spent starting tablet: real 0.044s user 0.024s sys 0.013s
W20250624 02:13:53.974457 32284 tablet.cc:2378] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20250624 02:13:54.008946 32576 raft_consensus.cc:491] T 9cb47de9f2b743b4a245a41ae82cadd9 P 293aefb746844e108ae0498be425a2ae [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250624 02:13:54.009475 32576 raft_consensus.cc:513] T 9cb47de9f2b743b4a245a41ae82cadd9 P 293aefb746844e108ae0498be425a2ae [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "293aefb746844e108ae0498be425a2ae" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 45233 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } }
I20250624 02:13:54.012069 32576 leader_election.cc:290] T 9cb47de9f2b743b4a245a41ae82cadd9 P 293aefb746844e108ae0498be425a2ae [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 11c448593b9b4152a40bf947653f01e2 (127.31.42.193:42355), 61196268dddc47f0966f365f9829389c (127.31.42.194:35929)
I20250624 02:13:54.021229 32105 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "9cb47de9f2b743b4a245a41ae82cadd9" candidate_uuid: "293aefb746844e108ae0498be425a2ae" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "11c448593b9b4152a40bf947653f01e2" is_pre_election: true
I20250624 02:13:54.022046 32105 raft_consensus.cc:2466] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 293aefb746844e108ae0498be425a2ae in term 0.
I20250624 02:13:54.023288 32438 leader_election.cc:304] T 9cb47de9f2b743b4a245a41ae82cadd9 P 293aefb746844e108ae0498be425a2ae [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 11c448593b9b4152a40bf947653f01e2, 293aefb746844e108ae0498be425a2ae; no voters:
I20250624 02:13:54.024245 32576 raft_consensus.cc:2802] T 9cb47de9f2b743b4a245a41ae82cadd9 P 293aefb746844e108ae0498be425a2ae [term 0 FOLLOWER]: Leader pre-election won for term 1
I20250624 02:13:54.024606 32576 raft_consensus.cc:491] T 9cb47de9f2b743b4a245a41ae82cadd9 P 293aefb746844e108ae0498be425a2ae [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20250624 02:13:54.024987 32576 raft_consensus.cc:3058] T 9cb47de9f2b743b4a245a41ae82cadd9 P 293aefb746844e108ae0498be425a2ae [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:13:54.024757 32238 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "9cb47de9f2b743b4a245a41ae82cadd9" candidate_uuid: "293aefb746844e108ae0498be425a2ae" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "61196268dddc47f0966f365f9829389c" is_pre_election: true
I20250624 02:13:54.025652 32238 raft_consensus.cc:2466] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 293aefb746844e108ae0498be425a2ae in term 0.
W20250624 02:13:54.030229 32151 tablet.cc:2378] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20250624 02:13:54.030097 32576 raft_consensus.cc:513] T 9cb47de9f2b743b4a245a41ae82cadd9 P 293aefb746844e108ae0498be425a2ae [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "293aefb746844e108ae0498be425a2ae" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 45233 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } }
I20250624 02:13:54.031610 32576 leader_election.cc:290] T 9cb47de9f2b743b4a245a41ae82cadd9 P 293aefb746844e108ae0498be425a2ae [CANDIDATE]: Term 1 election: Requested vote from peers 11c448593b9b4152a40bf947653f01e2 (127.31.42.193:42355), 61196268dddc47f0966f365f9829389c (127.31.42.194:35929)
I20250624 02:13:54.032466 32105 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "9cb47de9f2b743b4a245a41ae82cadd9" candidate_uuid: "293aefb746844e108ae0498be425a2ae" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "11c448593b9b4152a40bf947653f01e2"
I20250624 02:13:54.032501 32238 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "9cb47de9f2b743b4a245a41ae82cadd9" candidate_uuid: "293aefb746844e108ae0498be425a2ae" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "61196268dddc47f0966f365f9829389c"
I20250624 02:13:54.032907 32105 raft_consensus.cc:3058] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:13:54.032933 32238 raft_consensus.cc:3058] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:13:54.037439 32105 raft_consensus.cc:2466] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 293aefb746844e108ae0498be425a2ae in term 1.
I20250624 02:13:54.037439 32238 raft_consensus.cc:2466] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 293aefb746844e108ae0498be425a2ae in term 1.
I20250624 02:13:54.038518 32440 leader_election.cc:304] T 9cb47de9f2b743b4a245a41ae82cadd9 P 293aefb746844e108ae0498be425a2ae [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 293aefb746844e108ae0498be425a2ae, 61196268dddc47f0966f365f9829389c; no voters:
I20250624 02:13:54.039269 32576 raft_consensus.cc:2802] T 9cb47de9f2b743b4a245a41ae82cadd9 P 293aefb746844e108ae0498be425a2ae [term 1 FOLLOWER]: Leader election won for term 1
I20250624 02:13:54.040838 32576 raft_consensus.cc:695] T 9cb47de9f2b743b4a245a41ae82cadd9 P 293aefb746844e108ae0498be425a2ae [term 1 LEADER]: Becoming Leader. State: Replica: 293aefb746844e108ae0498be425a2ae, State: Running, Role: LEADER
I20250624 02:13:54.041771 32576 consensus_queue.cc:237] T 9cb47de9f2b743b4a245a41ae82cadd9 P 293aefb746844e108ae0498be425a2ae [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "293aefb746844e108ae0498be425a2ae" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 45233 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } }
I20250624 02:13:54.054723 31961 catalog_manager.cc:5582] T 9cb47de9f2b743b4a245a41ae82cadd9 P 293aefb746844e108ae0498be425a2ae reported cstate change: term changed from 0 to 1, leader changed from <none> to 293aefb746844e108ae0498be425a2ae (127.31.42.196). New cstate: current_term: 1 leader_uuid: "293aefb746844e108ae0498be425a2ae" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "293aefb746844e108ae0498be425a2ae" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 45233 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } health_report { overall_health: UNKNOWN } } }
W20250624 02:13:54.065924 32551 tablet.cc:2378] T 9cb47de9f2b743b4a245a41ae82cadd9 P 293aefb746844e108ae0498be425a2ae: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20250624 02:13:54.083225 31915 external_mini_cluster.cc:934] 4 TS(s) registered with all masters
I20250624 02:13:54.087258 31915 ts_itest-base.cc:246] Waiting for 1 tablets on tserver 11c448593b9b4152a40bf947653f01e2 to finish bootstrapping
I20250624 02:13:54.102224 31915 ts_itest-base.cc:246] Waiting for 1 tablets on tserver 61196268dddc47f0966f365f9829389c to finish bootstrapping
I20250624 02:13:54.114059 31915 ts_itest-base.cc:246] Waiting for 1 tablets on tserver 293aefb746844e108ae0498be425a2ae to finish bootstrapping
I20250624 02:13:54.127362 31915 kudu-admin-test.cc:709] Waiting for Master to see the current replicas...
I20250624 02:13:54.135849 31915 kudu-admin-test.cc:716] Tablet locations:
tablet_locations {
tablet_id: "9cb47de9f2b743b4a245a41ae82cadd9"
DEPRECATED_stale: false
partition {
partition_key_start: ""
partition_key_end: ""
}
interned_replicas {
ts_info_idx: 0
role: FOLLOWER
}
interned_replicas {
ts_info_idx: 1
role: LEADER
}
interned_replicas {
ts_info_idx: 2
role: FOLLOWER
}
}
ts_infos {
permanent_uuid: "11c448593b9b4152a40bf947653f01e2"
rpc_addresses {
host: "127.31.42.193"
port: 42355
}
}
ts_infos {
permanent_uuid: "293aefb746844e108ae0498be425a2ae"
rpc_addresses {
host: "127.31.42.196"
port: 45233
}
}
ts_infos {
permanent_uuid: "61196268dddc47f0966f365f9829389c"
rpc_addresses {
host: "127.31.42.194"
port: 35929
}
}
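A client can observe the same replica layout as the tablet-locations dump above through the scan-token API, which exposes each tablet's replica set and leader flag. This is a minimal sketch under the assumption that the table exists and the cluster is reachable at the placeholder master address; it is not part of the captured test.

#include <iostream>
#include <memory>
#include <vector>

#include <kudu/client/client.h>

using kudu::client::KuduClient;
using kudu::client::KuduClientBuilder;
using kudu::client::KuduReplica;
using kudu::client::KuduScanToken;
using kudu::client::KuduScanTokenBuilder;
using kudu::client::KuduTable;
using kudu::client::sp::shared_ptr;

int main() {
  // Placeholder master address; the test cluster above uses an ephemeral one.
  shared_ptr<KuduClient> client;
  kudu::Status s = KuduClientBuilder()
      .add_master_server_addr("127.0.0.1:7051")
      .Build(&client);
  if (!s.ok()) { std::cerr << s.ToString() << "\n"; return 1; }

  shared_ptr<KuduTable> table;
  s = client->OpenTable("TestTable", &table);
  if (!s.ok()) { std::cerr << s.ToString() << "\n"; return 1; }

  // One scan token per tablet; each token carries that tablet's replica set.
  std::vector<KuduScanToken*> tokens;
  KuduScanTokenBuilder builder(table.get());
  s = builder.Build(&tokens);
  if (!s.ok()) { std::cerr << s.ToString() << "\n"; return 1; }

  for (const KuduScanToken* token : tokens) {
    std::cout << "tablet " << token->tablet().id() << "\n";
    for (const KuduReplica* replica : token->tablet().replicas()) {
      std::cout << "  " << (replica->is_leader() ? "LEADER   " : "FOLLOWER ")
                << replica->ts().uuid() << " "
                << replica->ts().hostname() << ":" << replica->ts().port() << "\n";
    }
  }
  for (KuduScanToken* token : tokens) delete token;  // tokens are caller-owned
  return 0;
}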
I20250624 02:13:54.523993 32576 consensus_queue.cc:1035] T 9cb47de9f2b743b4a245a41ae82cadd9 P 293aefb746844e108ae0498be425a2ae [LEADER]: Connected to new peer: Peer: permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20250624 02:13:54.549744 32576 consensus_queue.cc:1035] T 9cb47de9f2b743b4a245a41ae82cadd9 P 293aefb746844e108ae0498be425a2ae [LEADER]: Connected to new peer: Peer: permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20250624 02:13:54.597148 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 32420
W20250624 02:13:54.624603 31946 connection.cc:537] server connection from 127.31.42.196:54173 recv error: Network error: recv error from unknown peer: Transport endpoint is not connected (error 107)
I20250624 02:13:54.626361 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 31929
I20250624 02:13:54.655318 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.31.42.254:36503
--webserver_interface=127.31.42.254
--webserver_port=43685
--builtin_ntp_servers=127.31.42.212:44861
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.31.42.254:36503 with env {}
W20250624 02:13:54.711911 32416 heartbeater.cc:646] Failed to heartbeat to 127.31.42.254:36503 (0 consecutive failures): Network error: Failed to send heartbeat to master: Client connection negotiation failed: client connection to 127.31.42.254:36503: connect: Connection refused (error 111)
W20250624 02:13:54.998448 32593 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:13:54.999019 32593 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:13:54.999435 32593 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:13:55.033232 32593 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20250624 02:13:55.033567 32593 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:13:55.033833 32593 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20250624 02:13:55.034160 32593 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20250624 02:13:55.070482 32593 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:44861
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.31.42.254:36503
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.31.42.254:36503
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb
--webserver_interface=127.31.42.254
--webserver_port=43685
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/logs
--logbuflevel=-1
--logtostderr=true
Master server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:13:55.071799 32593 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:13:55.073508 32593 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:13:55.088883 32601 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:13:55.547998 32279 debug-util.cc:398] Leaking SignalData structure 0x7b08000b2c20 after lost signal to thread 32155
W20250624 02:13:55.548904 32279 debug-util.cc:398] Leaking SignalData structure 0x7b0800036ee0 after lost signal to thread 32282
W20250624 02:13:55.615664 32150 heartbeater.cc:646] Failed to heartbeat to 127.31.42.254:36503 (0 consecutive failures): Network error: Failed to send heartbeat to master: Client connection negotiation failed: client connection to 127.31.42.254:36503: connect: Connection refused (error 111)
W20250624 02:13:55.615540 32283 heartbeater.cc:646] Failed to heartbeat to 127.31.42.254:36503 (0 consecutive failures): Network error: Failed to send heartbeat to master: Client connection negotiation failed: client connection to 127.31.42.254:36503: connect: Connection refused (error 111)
I20250624 02:13:56.070715 32609 raft_consensus.cc:491] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 293aefb746844e108ae0498be425a2ae)
I20250624 02:13:56.071314 32609 raft_consensus.cc:513] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "293aefb746844e108ae0498be425a2ae" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 45233 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } }
I20250624 02:13:56.078717 32609 leader_election.cc:290] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 293aefb746844e108ae0498be425a2ae (127.31.42.196:45233), 61196268dddc47f0966f365f9829389c (127.31.42.194:35929)
W20250624 02:13:56.079063 32038 proxy.cc:239] Call had error, refreshing address and retrying: Network error: Client connection negotiation failed: client connection to 127.31.42.196:45233: connect: Connection refused (error 111)
I20250624 02:13:56.091080 32613 raft_consensus.cc:491] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [term 1 FOLLOWER]: Starting pre-election (detected failure of leader 293aefb746844e108ae0498be425a2ae)
I20250624 02:13:56.091840 32613 raft_consensus.cc:513] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "293aefb746844e108ae0498be425a2ae" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 45233 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } }
W20250624 02:13:56.094584 32038 leader_election.cc:336] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer 293aefb746844e108ae0498be425a2ae (127.31.42.196:45233): Network error: Client connection negotiation failed: client connection to 127.31.42.196:45233: connect: Connection refused (error 111)
I20250624 02:13:56.104192 32238 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "9cb47de9f2b743b4a245a41ae82cadd9" candidate_uuid: "11c448593b9b4152a40bf947653f01e2" candidate_term: 2 candidate_status { last_received { term: 1 index: 1 } } ignore_live_leader: false dest_uuid: "61196268dddc47f0966f365f9829389c" is_pre_election: true
I20250624 02:13:56.104884 32238 raft_consensus.cc:2466] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [term 1 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 11c448593b9b4152a40bf947653f01e2 in term 1.
I20250624 02:13:56.106873 32040 leader_election.cc:304] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 11c448593b9b4152a40bf947653f01e2, 61196268dddc47f0966f365f9829389c; no voters: 293aefb746844e108ae0498be425a2ae
I20250624 02:13:56.107976 32609 raft_consensus.cc:2802] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [term 1 FOLLOWER]: Leader pre-election won for term 2
I20250624 02:13:56.108330 32609 raft_consensus.cc:491] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [term 1 FOLLOWER]: Starting leader election (detected failure of leader 293aefb746844e108ae0498be425a2ae)
I20250624 02:13:56.108682 32609 raft_consensus.cc:3058] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [term 1 FOLLOWER]: Advancing to term 2
I20250624 02:13:56.112641 32613 leader_election.cc:290] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 11c448593b9b4152a40bf947653f01e2 (127.31.42.193:42355), 293aefb746844e108ae0498be425a2ae (127.31.42.196:45233)
I20250624 02:13:56.117621 32609 raft_consensus.cc:513] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "293aefb746844e108ae0498be425a2ae" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 45233 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } }
W20250624 02:13:56.120692 32171 proxy.cc:239] Call had error, refreshing address and retrying: Network error: Client connection negotiation failed: client connection to 127.31.42.196:45233: connect: Connection refused (error 111)
I20250624 02:13:56.121165 32238 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "9cb47de9f2b743b4a245a41ae82cadd9" candidate_uuid: "11c448593b9b4152a40bf947653f01e2" candidate_term: 2 candidate_status { last_received { term: 1 index: 1 } } ignore_live_leader: false dest_uuid: "61196268dddc47f0966f365f9829389c"
I20250624 02:13:56.121747 32238 raft_consensus.cc:3058] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [term 1 FOLLOWER]: Advancing to term 2
W20250624 02:13:56.125202 32038 leader_election.cc:336] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [CANDIDATE]: Term 2 election: RPC error from VoteRequest() call to peer 293aefb746844e108ae0498be425a2ae (127.31.42.196:45233): Network error: Client connection negotiation failed: client connection to 127.31.42.196:45233: connect: Connection refused (error 111)
I20250624 02:13:56.130795 32238 raft_consensus.cc:2466] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [term 2 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 11c448593b9b4152a40bf947653f01e2 in term 2.
I20250624 02:13:56.132134 32040 leader_election.cc:304] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 11c448593b9b4152a40bf947653f01e2, 61196268dddc47f0966f365f9829389c; no voters: 293aefb746844e108ae0498be425a2ae
I20250624 02:13:56.134063 32609 leader_election.cc:290] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [CANDIDATE]: Term 2 election: Requested vote from peers 293aefb746844e108ae0498be425a2ae (127.31.42.196:45233), 61196268dddc47f0966f365f9829389c (127.31.42.194:35929)
I20250624 02:13:56.134850 32609 raft_consensus.cc:2802] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [term 2 FOLLOWER]: Leader election won for term 2
I20250624 02:13:56.135442 32609 raft_consensus.cc:695] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [term 2 LEADER]: Becoming Leader. State: Replica: 11c448593b9b4152a40bf947653f01e2, State: Running, Role: LEADER
W20250624 02:13:56.136883 32171 leader_election.cc:336] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer 293aefb746844e108ae0498be425a2ae (127.31.42.196:45233): Network error: Client connection negotiation failed: client connection to 127.31.42.196:45233: connect: Connection refused (error 111)
I20250624 02:13:56.136703 32609 consensus_queue.cc:237] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 1, Committed index: 1, Last appended: 1.1, Last appended by leader: 1, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "293aefb746844e108ae0498be425a2ae" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 45233 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } }
I20250624 02:13:56.166276 32105 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "9cb47de9f2b743b4a245a41ae82cadd9" candidate_uuid: "61196268dddc47f0966f365f9829389c" candidate_term: 2 candidate_status { last_received { term: 1 index: 1 } } ignore_live_leader: false dest_uuid: "11c448593b9b4152a40bf947653f01e2" is_pre_election: true
I20250624 02:13:56.168551 32171 leader_election.cc:304] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 61196268dddc47f0966f365f9829389c; no voters: 11c448593b9b4152a40bf947653f01e2, 293aefb746844e108ae0498be425a2ae
I20250624 02:13:56.169914 32613 raft_consensus.cc:2747] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [term 2 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
W20250624 02:13:55.090010 32600 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:13:55.090654 32593 server_base.cc:1048] running on GCE node
W20250624 02:13:55.088956 32603 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:13:56.371023 32593 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:13:56.374291 32593 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:13:56.375753 32593 hybrid_clock.cc:648] HybridClock initialized: now 1750731236375714 us; error 55 us; skew 500 ppm
I20250624 02:13:56.376645 32593 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:13:56.388324 32593 webserver.cc:469] Webserver started at http://127.31.42.254:43685/ using document root <none> and password file <none>
I20250624 02:13:56.389364 32593 fs_manager.cc:362] Metadata directory not provided
I20250624 02:13:56.389608 32593 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:13:56.398250 32593 fs_manager.cc:714] Time spent opening directory manager: real 0.006s user 0.007s sys 0.000s
I20250624 02:13:56.403374 32624 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:13:56.404531 32593 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.003s sys 0.002s
I20250624 02:13:56.404892 32593 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
uuid: "524d9ddd4367420893a4c22505825b22"
format_stamp: "Formatted at 2025-06-24 02:13:45 on dist-test-slave-5k9r"
I20250624 02:13:56.406942 32593 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:13:56.476343 32593 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:13:56.477900 32593 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:13:56.478415 32593 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:13:56.570250 32593 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.254:36503
I20250624 02:13:56.570322 32675 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.254:36503 every 8 connection(s)
I20250624 02:13:56.573618 32593 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb
I20250624 02:13:56.577279 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 32593
I20250624 02:13:56.577865 31915 kudu-admin-test.cc:735] Forcing unsafe config change on tserver 61196268dddc47f0966f365f9829389c
I20250624 02:13:56.590963 32238 raft_consensus.cc:1273] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [term 2 FOLLOWER]: Refusing update from remote peer 11c448593b9b4152a40bf947653f01e2: Log matching property violated. Preceding OpId in replica: term: 1 index: 1. Preceding OpId from leader: term: 2 index: 2. (index mismatch)
I20250624 02:13:56.591660 32676 sys_catalog.cc:263] Verifying existing consensus state
I20250624 02:13:56.593031 32609 consensus_queue.cc:1035] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [LEADER]: Connected to new peer: Peer: permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 2, Last known committed idx: 1, Time since last communication: 0.001s
I20250624 02:13:56.600939 32676 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22: Bootstrap starting.
I20250624 02:13:56.666257 32283 heartbeater.cc:344] Connected to a master server at 127.31.42.254:36503
W20250624 02:13:56.700508 32038 consensus_peers.cc:487] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 -> Peer 293aefb746844e108ae0498be425a2ae (127.31.42.196:45233): Couldn't send request to peer 293aefb746844e108ae0498be425a2ae. Status: Network error: Client connection negotiation failed: client connection to 127.31.42.196:45233: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20250624 02:13:56.704563 32150 heartbeater.cc:344] Connected to a master server at 127.31.42.254:36503
I20250624 02:13:56.710294 32676 log.cc:826] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22: Log is configured to *not* fsync() on all Append() calls
I20250624 02:13:56.745755 32416 heartbeater.cc:344] Connected to a master server at 127.31.42.254:36503
I20250624 02:13:56.746845 32676 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22: Bootstrap replayed 1/1 log segments. Stats: ops{read=7 overwritten=0 applied=7 ignored=0} inserts{seen=5 ignored=0} mutations{seen=2 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 02:13:56.748185 32676 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22: Bootstrap complete.
I20250624 02:13:56.778520 32676 raft_consensus.cc:357] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "524d9ddd4367420893a4c22505825b22" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 36503 } }
I20250624 02:13:56.781885 32676 raft_consensus.cc:738] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 524d9ddd4367420893a4c22505825b22, State: Initialized, Role: FOLLOWER
I20250624 02:13:56.783106 32676 consensus_queue.cc:260] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 7, Last appended: 1.7, Last appended by leader: 7, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "524d9ddd4367420893a4c22505825b22" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 36503 } }
I20250624 02:13:56.783924 32676 raft_consensus.cc:397] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [term 1 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 02:13:56.784330 32676 raft_consensus.cc:491] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [term 1 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 02:13:56.784806 32676 raft_consensus.cc:3058] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [term 1 FOLLOWER]: Advancing to term 2
I20250624 02:13:56.794878 32676 raft_consensus.cc:513] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "524d9ddd4367420893a4c22505825b22" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 36503 } }
I20250624 02:13:56.796144 32676 leader_election.cc:304] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 524d9ddd4367420893a4c22505825b22; no voters:
I20250624 02:13:56.810554 32676 leader_election.cc:290] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [CANDIDATE]: Term 2 election: Requested vote from peers
I20250624 02:13:56.811146 32688 raft_consensus.cc:2802] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [term 2 FOLLOWER]: Leader election won for term 2
I20250624 02:13:56.825340 32688 raft_consensus.cc:695] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [term 2 LEADER]: Becoming Leader. State: Replica: 524d9ddd4367420893a4c22505825b22, State: Running, Role: LEADER
I20250624 02:13:56.826835 32676 sys_catalog.cc:564] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [sys.catalog]: configured and running, proceeding with master startup.
I20250624 02:13:56.826522 32688 consensus_queue.cc:237] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 7, Committed index: 7, Last appended: 1.7, Last appended by leader: 7, Current term: 2, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "524d9ddd4367420893a4c22505825b22" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 36503 } }
I20250624 02:13:56.848634 32689 sys_catalog.cc:455] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 2 leader_uuid: "524d9ddd4367420893a4c22505825b22" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "524d9ddd4367420893a4c22505825b22" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 36503 } } }
I20250624 02:13:56.850143 32690 sys_catalog.cc:455] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 524d9ddd4367420893a4c22505825b22. Latest consensus state: current_term: 2 leader_uuid: "524d9ddd4367420893a4c22505825b22" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "524d9ddd4367420893a4c22505825b22" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 36503 } } }
I20250624 02:13:56.850970 32690 sys_catalog.cc:458] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [sys.catalog]: This master's current role is: LEADER
I20250624 02:13:56.852118 32689 sys_catalog.cc:458] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22 [sys.catalog]: This master's current role is: LEADER
I20250624 02:13:56.863168 32697 catalog_manager.cc:1477] Loading table and tablet metadata into memory...
I20250624 02:13:56.884775 32697 catalog_manager.cc:671] Loaded metadata for table TestTable [id=22095834d7a64d8fbe014eb049e2e8bc]
I20250624 02:13:56.901448 32697 tablet_loader.cc:96] loaded metadata for tablet 9cb47de9f2b743b4a245a41ae82cadd9 (table TestTable [id=22095834d7a64d8fbe014eb049e2e8bc])
I20250624 02:13:56.903327 32697 catalog_manager.cc:1486] Initializing Kudu cluster ID...
I20250624 02:13:56.914894 32697 catalog_manager.cc:1261] Loaded cluster ID: a84600010588424e828f8667c956016b
I20250624 02:13:56.923723 32697 catalog_manager.cc:1497] Initializing Kudu internal certificate authority...
I20250624 02:13:56.933326 32697 catalog_manager.cc:1506] Loading token signing keys...
I20250624 02:13:56.935241 32705 catalog_manager.cc:797] Waiting for catalog manager background task thread to start: Service unavailable: Catalog manager is not initialized. State: Starting
I20250624 02:13:56.940558 32697 catalog_manager.cc:5966] T 00000000000000000000000000000000 P 524d9ddd4367420893a4c22505825b22: Loaded TSK: 0
I20250624 02:13:56.942313 32697 catalog_manager.cc:1516] Initializing in-progress tserver states...
W20250624 02:13:57.035856 32678 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:13:57.036500 32678 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:13:57.069087 32678 flags.cc:425] Enabled experimental flag: --enable_multi_tenancy=false
I20250624 02:13:57.690320 32641 master_service.cc:432] Got heartbeat from unknown tserver (permanent_uuid: "61196268dddc47f0966f365f9829389c" instance_seqno: 1750731229673273) as {username='slave'} at 127.31.42.194:43625; Asking this server to re-register.
I20250624 02:13:57.692219 32283 heartbeater.cc:461] Registering TS with master...
I20250624 02:13:57.692888 32283 heartbeater.cc:507] Master 127.31.42.254:36503 requested a full tablet report, sending...
I20250624 02:13:57.696265 32640 ts_manager.cc:194] Registered new tserver with Master: 61196268dddc47f0966f365f9829389c (127.31.42.194:35929)
I20250624 02:13:57.700170 32640 catalog_manager.cc:5582] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c reported cstate change: term changed from 1 to 2, leader changed from 293aefb746844e108ae0498be425a2ae (127.31.42.196) to 11c448593b9b4152a40bf947653f01e2 (127.31.42.193). New cstate: current_term: 2 leader_uuid: "11c448593b9b4152a40bf947653f01e2" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "293aefb746844e108ae0498be425a2ae" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 45233 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } } }
I20250624 02:13:57.714452 32641 master_service.cc:432] Got heartbeat from unknown tserver (permanent_uuid: "11c448593b9b4152a40bf947653f01e2" instance_seqno: 1750731227723163) as {username='slave'} at 127.31.42.193:55673; Asking this server to re-register.
I20250624 02:13:57.730640 32150 heartbeater.cc:461] Registering TS with master...
I20250624 02:13:57.731608 32150 heartbeater.cc:507] Master 127.31.42.254:36503 requested a full tablet report, sending...
I20250624 02:13:57.741009 32641 ts_manager.cc:194] Registered new tserver with Master: 11c448593b9b4152a40bf947653f01e2 (127.31.42.193:42355)
I20250624 02:13:57.753154 32639 master_service.cc:432] Got heartbeat from unknown tserver (permanent_uuid: "3a7ccbc73c7243fd9e789330a7c77baf" instance_seqno: 1750731231575072) as {username='slave'} at 127.31.42.195:47907; Asking this server to re-register.
I20250624 02:13:57.755070 32416 heartbeater.cc:461] Registering TS with master...
I20250624 02:13:57.755750 32416 heartbeater.cc:507] Master 127.31.42.254:36503 requested a full tablet report, sending...
I20250624 02:13:57.758513 32639 ts_manager.cc:194] Registered new tserver with Master: 3a7ccbc73c7243fd9e789330a7c77baf (127.31.42.195:46295)
W20250624 02:13:58.523141 32710 debug-util.cc:398] Leaking SignalData structure 0x7b0800036040 after lost signal to thread 32678
W20250624 02:13:58.524334 32710 kernel_stack_watchdog.cc:198] Thread 32678 stuck at /home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/util/thread.cc:641 for 401ms:
Kernel stack:
(could not read kernel stack)
User stack:
<Timed out: thread did not respond: maybe it is blocking signals>
W20250624 02:13:58.829468 32678 thread.cc:640] rpc reactor (reactor) Time spent creating pthread: real 1.710s user 0.592s sys 1.079s
W20250624 02:13:58.958168 32678 thread.cc:606] rpc reactor (reactor) Time spent starting thread: real 1.839s user 0.598s sys 1.090s
I20250624 02:13:59.003921 32237 tablet_service.cc:1905] Received UnsafeChangeConfig RPC: dest_uuid: "61196268dddc47f0966f365f9829389c"
tablet_id: "9cb47de9f2b743b4a245a41ae82cadd9"
caller_id: "kudu-tools"
new_config {
peers {
permanent_uuid: "11c448593b9b4152a40bf947653f01e2"
}
peers {
permanent_uuid: "61196268dddc47f0966f365f9829389c"
}
}
from {username='slave'} at 127.0.0.1:35656
W20250624 02:13:59.005244 32237 raft_consensus.cc:2216] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [term 2 FOLLOWER]: PROCEEDING WITH UNSAFE CONFIG CHANGE ON THIS SERVER, COMMITTED CONFIG: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "293aefb746844e108ae0498be425a2ae" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 45233 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } }NEW CONFIG: opid_index: 3 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } } unsafe_config_change: true
I20250624 02:13:59.006403 32237 raft_consensus.cc:3058] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [term 2 FOLLOWER]: Advancing to term 3
W20250624 02:13:59.113301 32038 consensus_peers.cc:487] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 -> Peer 293aefb746844e108ae0498be425a2ae (127.31.42.196:45233): Couldn't send request to peer 293aefb746844e108ae0498be425a2ae. Status: Network error: Client connection negotiation failed: client connection to 127.31.42.196:45233: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
I20250624 02:13:59.419193 32237 raft_consensus.cc:1238] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [term 3 FOLLOWER]: Rejecting Update request from peer 11c448593b9b4152a40bf947653f01e2 for earlier term 2. Current term is 3. Ops: []
I20250624 02:13:59.420322 32720 consensus_queue.cc:1046] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [LEADER]: Peer responded invalid term: Peer: permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 }, Status: INVALID_TERM, Last received: 2.2, Next index: 3, Last known committed idx: 2, Time since last communication: 0.000s
I20250624 02:13:59.421545 32720 raft_consensus.cc:3053] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [term 2 LEADER]: Stepping down as leader of term 2
I20250624 02:13:59.421833 32720 raft_consensus.cc:738] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [term 2 LEADER]: Becoming Follower/Learner. State: Replica: 11c448593b9b4152a40bf947653f01e2, State: Running, Role: LEADER
I20250624 02:13:59.422447 32720 consensus_queue.cc:260] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 2, Committed index: 2, Last appended: 2.2, Last appended by leader: 2, Current term: 2, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "293aefb746844e108ae0498be425a2ae" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 45233 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } }
I20250624 02:13:59.423368 32720 raft_consensus.cc:3058] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [term 2 FOLLOWER]: Advancing to term 3
W20250624 02:13:59.726703 32710 debug-util.cc:398] Leaking SignalData structure 0x7b0800036060 after lost signal to thread 32678
I20250624 02:14:00.514106 32737 raft_consensus.cc:491] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [term 3 FOLLOWER]: Starting pre-election (detected failure of leader kudu-tools)
I20250624 02:14:00.514515 32737 raft_consensus.cc:513] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [term 3 FOLLOWER]: Starting pre-election with config: opid_index: 3 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } } unsafe_config_change: true
I20250624 02:14:00.515621 32737 leader_election.cc:290] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [CANDIDATE]: Term 4 pre-election: Requested pre-vote from peers 11c448593b9b4152a40bf947653f01e2 (127.31.42.193:42355)
I20250624 02:14:00.516880 32105 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "9cb47de9f2b743b4a245a41ae82cadd9" candidate_uuid: "61196268dddc47f0966f365f9829389c" candidate_term: 4 candidate_status { last_received { term: 3 index: 3 } } ignore_live_leader: false dest_uuid: "11c448593b9b4152a40bf947653f01e2" is_pre_election: true
I20250624 02:14:00.517374 32105 raft_consensus.cc:2466] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [term 3 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 61196268dddc47f0966f365f9829389c in term 3.
I20250624 02:14:00.518365 32171 leader_election.cc:304] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [CANDIDATE]: Term 4 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 2 voters: 2 yes votes; 0 no votes. yes voters: 11c448593b9b4152a40bf947653f01e2, 61196268dddc47f0966f365f9829389c; no voters:
I20250624 02:14:00.518959 32737 raft_consensus.cc:2802] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [term 3 FOLLOWER]: Leader pre-election won for term 4
I20250624 02:14:00.519213 32737 raft_consensus.cc:491] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [term 3 FOLLOWER]: Starting leader election (detected failure of leader kudu-tools)
I20250624 02:14:00.519430 32737 raft_consensus.cc:3058] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [term 3 FOLLOWER]: Advancing to term 4
I20250624 02:14:00.523516 32737 raft_consensus.cc:513] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [term 4 FOLLOWER]: Starting leader election with config: opid_index: 3 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } } unsafe_config_change: true
I20250624 02:14:00.524533 32737 leader_election.cc:290] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [CANDIDATE]: Term 4 election: Requested vote from peers 11c448593b9b4152a40bf947653f01e2 (127.31.42.193:42355)
I20250624 02:14:00.525462 32105 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "9cb47de9f2b743b4a245a41ae82cadd9" candidate_uuid: "61196268dddc47f0966f365f9829389c" candidate_term: 4 candidate_status { last_received { term: 3 index: 3 } } ignore_live_leader: false dest_uuid: "11c448593b9b4152a40bf947653f01e2"
I20250624 02:14:00.525880 32105 raft_consensus.cc:3058] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [term 3 FOLLOWER]: Advancing to term 4
I20250624 02:14:00.530102 32105 raft_consensus.cc:2466] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [term 4 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 61196268dddc47f0966f365f9829389c in term 4.
I20250624 02:14:00.530963 32171 leader_election.cc:304] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [CANDIDATE]: Term 4 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 2 voters: 2 yes votes; 0 no votes. yes voters: 11c448593b9b4152a40bf947653f01e2, 61196268dddc47f0966f365f9829389c; no voters:
I20250624 02:14:00.531548 32737 raft_consensus.cc:2802] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [term 4 FOLLOWER]: Leader election won for term 4
I20250624 02:14:00.532356 32737 raft_consensus.cc:695] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [term 4 LEADER]: Becoming Leader. State: Replica: 61196268dddc47f0966f365f9829389c, State: Running, Role: LEADER
I20250624 02:14:00.533149 32737 consensus_queue.cc:237] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 2, Committed index: 2, Last appended: 3.3, Last appended by leader: 0, Current term: 4, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 3 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } } unsafe_config_change: true
I20250624 02:14:00.539469 32640 catalog_manager.cc:5582] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c reported cstate change: term changed from 2 to 4, leader changed from 11c448593b9b4152a40bf947653f01e2 (127.31.42.193) to 61196268dddc47f0966f365f9829389c (127.31.42.194), now has a pending config: VOTER 11c448593b9b4152a40bf947653f01e2 (127.31.42.193), VOTER 61196268dddc47f0966f365f9829389c (127.31.42.194). New cstate: current_term: 4 leader_uuid: "61196268dddc47f0966f365f9829389c" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "293aefb746844e108ae0498be425a2ae" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 45233 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } health_report { overall_health: HEALTHY } } } pending_config { opid_index: 3 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } } unsafe_config_change: true }
I20250624 02:14:01.037052 32105 raft_consensus.cc:1273] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [term 4 FOLLOWER]: Refusing update from remote peer 61196268dddc47f0966f365f9829389c: Log matching property violated. Preceding OpId in replica: term: 2 index: 2. Preceding OpId from leader: term: 4 index: 4. (index mismatch)
I20250624 02:14:01.038373 32737 consensus_queue.cc:1035] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [LEADER]: Connected to new peer: Peer: permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 4, Last known committed idx: 2, Time since last communication: 0.000s
I20250624 02:14:01.046766 32746 raft_consensus.cc:2953] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [term 4 LEADER]: Committing config change with OpId 3.3: config changed from index -1 to 3, VOTER 293aefb746844e108ae0498be425a2ae (127.31.42.196) evicted. New config: { opid_index: 3 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } } unsafe_config_change: true }
I20250624 02:14:01.048209 32105 raft_consensus.cc:2953] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [term 4 FOLLOWER]: Committing config change with OpId 3.3: config changed from index -1 to 3, VOTER 293aefb746844e108ae0498be425a2ae (127.31.42.196) evicted. New config: { opid_index: 3 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } } unsafe_config_change: true }
I20250624 02:14:01.060487 32640 catalog_manager.cc:5582] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c reported cstate change: config changed from index -1 to 3, VOTER 293aefb746844e108ae0498be425a2ae (127.31.42.196) evicted, no longer has a pending config: VOTER 11c448593b9b4152a40bf947653f01e2 (127.31.42.193), VOTER 61196268dddc47f0966f365f9829389c (127.31.42.194). New cstate: current_term: 4 leader_uuid: "61196268dddc47f0966f365f9829389c" committed_config { opid_index: 3 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } health_report { overall_health: HEALTHY } } unsafe_config_change: true }
W20250624 02:14:01.069728 32640 catalog_manager.cc:5774] Failed to send DeleteTablet RPC for tablet 9cb47de9f2b743b4a245a41ae82cadd9 on TS 293aefb746844e108ae0498be425a2ae: Not found: failed to reset TS proxy: Could not find TS for UUID 293aefb746844e108ae0498be425a2ae
I20250624 02:14:01.086031 32237 consensus_queue.cc:237] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 4, Committed index: 4, Last appended: 4.4, Last appended by leader: 0, Current term: 4, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 5 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } } peers { permanent_uuid: "3a7ccbc73c7243fd9e789330a7c77baf" member_type: NON_VOTER last_known_addr { host: "127.31.42.195" port: 46295 } attrs { promote: true } } unsafe_config_change: true
I20250624 02:14:01.092397 32105 raft_consensus.cc:1273] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [term 4 FOLLOWER]: Refusing update from remote peer 61196268dddc47f0966f365f9829389c: Log matching property violated. Preceding OpId in replica: term: 4 index: 4. Preceding OpId from leader: term: 4 index: 5. (index mismatch)
I20250624 02:14:01.093505 32737 consensus_queue.cc:1035] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [LEADER]: Connected to new peer: Peer: permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 5, Last known committed idx: 4, Time since last communication: 0.000s
I20250624 02:14:01.098819 32746 raft_consensus.cc:2953] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [term 4 LEADER]: Committing config change with OpId 4.5: config changed from index 3 to 5, NON_VOTER 3a7ccbc73c7243fd9e789330a7c77baf (127.31.42.195) added. New config: { opid_index: 5 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } } peers { permanent_uuid: "3a7ccbc73c7243fd9e789330a7c77baf" member_type: NON_VOTER last_known_addr { host: "127.31.42.195" port: 46295 } attrs { promote: true } } unsafe_config_change: true }
I20250624 02:14:01.100365 32105 raft_consensus.cc:2953] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [term 4 FOLLOWER]: Committing config change with OpId 4.5: config changed from index 3 to 5, NON_VOTER 3a7ccbc73c7243fd9e789330a7c77baf (127.31.42.195) added. New config: { opid_index: 5 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } } peers { permanent_uuid: "3a7ccbc73c7243fd9e789330a7c77baf" member_type: NON_VOTER last_known_addr { host: "127.31.42.195" port: 46295 } attrs { promote: true } } unsafe_config_change: true }
W20250624 02:14:01.101420 32171 consensus_peers.cc:487] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c -> Peer 3a7ccbc73c7243fd9e789330a7c77baf (127.31.42.195:46295): Couldn't send request to peer 3a7ccbc73c7243fd9e789330a7c77baf. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 9cb47de9f2b743b4a245a41ae82cadd9. This is attempt 1: this message will repeat every 5th retry.
I20250624 02:14:01.107801 32627 catalog_manager.cc:5095] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 9cb47de9f2b743b4a245a41ae82cadd9 with cas_config_opid_index 3: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20250624 02:14:01.113507 32641 catalog_manager.cc:5582] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 reported cstate change: config changed from index 3 to 5, NON_VOTER 3a7ccbc73c7243fd9e789330a7c77baf (127.31.42.195) added. New cstate: current_term: 4 leader_uuid: "61196268dddc47f0966f365f9829389c" committed_config { opid_index: 5 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } } peers { permanent_uuid: "3a7ccbc73c7243fd9e789330a7c77baf" member_type: NON_VOTER last_known_addr { host: "127.31.42.195" port: 46295 } attrs { promote: true } } unsafe_config_change: true }
W20250624 02:14:01.122632 32626 catalog_manager.cc:4726] Async tablet task DeleteTablet RPC for tablet 9cb47de9f2b743b4a245a41ae82cadd9 on TS 293aefb746844e108ae0498be425a2ae failed: Not found: failed to reset TS proxy: Could not find TS for UUID 293aefb746844e108ae0498be425a2ae
I20250624 02:14:01.684877 32757 ts_tablet_manager.cc:927] T 9cb47de9f2b743b4a245a41ae82cadd9 P 3a7ccbc73c7243fd9e789330a7c77baf: Initiating tablet copy from peer 61196268dddc47f0966f365f9829389c (127.31.42.194:35929)
I20250624 02:14:01.687216 32757 tablet_copy_client.cc:323] T 9cb47de9f2b743b4a245a41ae82cadd9 P 3a7ccbc73c7243fd9e789330a7c77baf: tablet copy: Beginning tablet copy session from remote peer at address 127.31.42.194:35929
I20250624 02:14:01.697701 32258 tablet_copy_service.cc:140] P 61196268dddc47f0966f365f9829389c: Received BeginTabletCopySession request for tablet 9cb47de9f2b743b4a245a41ae82cadd9 from peer 3a7ccbc73c7243fd9e789330a7c77baf ({username='slave'} at 127.31.42.195:43947)
I20250624 02:14:01.698156 32258 tablet_copy_service.cc:161] P 61196268dddc47f0966f365f9829389c: Beginning new tablet copy session on tablet 9cb47de9f2b743b4a245a41ae82cadd9 from peer 3a7ccbc73c7243fd9e789330a7c77baf at {username='slave'} at 127.31.42.195:43947: session id = 3a7ccbc73c7243fd9e789330a7c77baf-9cb47de9f2b743b4a245a41ae82cadd9
I20250624 02:14:01.702860 32258 tablet_copy_source_session.cc:215] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c: Tablet Copy: opened 0 blocks and 1 log segments
I20250624 02:14:01.708026 32757 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 9cb47de9f2b743b4a245a41ae82cadd9. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:14:01.727456 32757 tablet_copy_client.cc:806] T 9cb47de9f2b743b4a245a41ae82cadd9 P 3a7ccbc73c7243fd9e789330a7c77baf: tablet copy: Starting download of 0 data blocks...
I20250624 02:14:01.728152 32757 tablet_copy_client.cc:670] T 9cb47de9f2b743b4a245a41ae82cadd9 P 3a7ccbc73c7243fd9e789330a7c77baf: tablet copy: Starting download of 1 WAL segments...
I20250624 02:14:01.731736 32757 tablet_copy_client.cc:538] T 9cb47de9f2b743b4a245a41ae82cadd9 P 3a7ccbc73c7243fd9e789330a7c77baf: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20250624 02:14:01.737799 32757 tablet_bootstrap.cc:492] T 9cb47de9f2b743b4a245a41ae82cadd9 P 3a7ccbc73c7243fd9e789330a7c77baf: Bootstrap starting.
I20250624 02:14:01.750378 32757 log.cc:826] T 9cb47de9f2b743b4a245a41ae82cadd9 P 3a7ccbc73c7243fd9e789330a7c77baf: Log is configured to *not* fsync() on all Append() calls
I20250624 02:14:01.762343 32757 tablet_bootstrap.cc:492] T 9cb47de9f2b743b4a245a41ae82cadd9 P 3a7ccbc73c7243fd9e789330a7c77baf: Bootstrap replayed 1/1 log segments. Stats: ops{read=5 overwritten=0 applied=5 ignored=0} inserts{seen=0 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 02:14:01.763118 32757 tablet_bootstrap.cc:492] T 9cb47de9f2b743b4a245a41ae82cadd9 P 3a7ccbc73c7243fd9e789330a7c77baf: Bootstrap complete.
I20250624 02:14:01.763695 32757 ts_tablet_manager.cc:1397] T 9cb47de9f2b743b4a245a41ae82cadd9 P 3a7ccbc73c7243fd9e789330a7c77baf: Time spent bootstrapping tablet: real 0.026s user 0.023s sys 0.005s
I20250624 02:14:01.782871 32757 raft_consensus.cc:357] T 9cb47de9f2b743b4a245a41ae82cadd9 P 3a7ccbc73c7243fd9e789330a7c77baf [term 4 LEARNER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: 5 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } } peers { permanent_uuid: "3a7ccbc73c7243fd9e789330a7c77baf" member_type: NON_VOTER last_known_addr { host: "127.31.42.195" port: 46295 } attrs { promote: true } } unsafe_config_change: true
I20250624 02:14:01.783792 32757 raft_consensus.cc:738] T 9cb47de9f2b743b4a245a41ae82cadd9 P 3a7ccbc73c7243fd9e789330a7c77baf [term 4 LEARNER]: Becoming Follower/Learner. State: Replica: 3a7ccbc73c7243fd9e789330a7c77baf, State: Initialized, Role: LEARNER
I20250624 02:14:01.784454 32757 consensus_queue.cc:260] T 9cb47de9f2b743b4a245a41ae82cadd9 P 3a7ccbc73c7243fd9e789330a7c77baf [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 5, Last appended: 4.5, Last appended by leader: 5, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: 5 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } } peers { permanent_uuid: "3a7ccbc73c7243fd9e789330a7c77baf" member_type: NON_VOTER last_known_addr { host: "127.31.42.195" port: 46295 } attrs { promote: true } } unsafe_config_change: true
I20250624 02:14:01.788272 32757 ts_tablet_manager.cc:1428] T 9cb47de9f2b743b4a245a41ae82cadd9 P 3a7ccbc73c7243fd9e789330a7c77baf: Time spent starting tablet: real 0.024s user 0.017s sys 0.007s
I20250624 02:14:01.789968 32258 tablet_copy_service.cc:342] P 61196268dddc47f0966f365f9829389c: Request end of tablet copy session 3a7ccbc73c7243fd9e789330a7c77baf-9cb47de9f2b743b4a245a41ae82cadd9 received from {username='slave'} at 127.31.42.195:43947
I20250624 02:14:01.790441 32258 tablet_copy_service.cc:434] P 61196268dddc47f0966f365f9829389c: ending tablet copy session 3a7ccbc73c7243fd9e789330a7c77baf-9cb47de9f2b743b4a245a41ae82cadd9 on tablet 9cb47de9f2b743b4a245a41ae82cadd9 with peer 3a7ccbc73c7243fd9e789330a7c77baf
I20250624 02:14:02.146071 32371 raft_consensus.cc:1215] T 9cb47de9f2b743b4a245a41ae82cadd9 P 3a7ccbc73c7243fd9e789330a7c77baf [term 4 LEARNER]: Deduplicated request from leader. Original: 4.4->[4.5-4.5] Dedup: 4.5->[]
W20250624 02:14:02.288897 32626 catalog_manager.cc:4726] Async tablet task DeleteTablet RPC for tablet 9cb47de9f2b743b4a245a41ae82cadd9 on TS 293aefb746844e108ae0498be425a2ae failed: Not found: failed to reset TS proxy: Could not find TS for UUID 293aefb746844e108ae0498be425a2ae
I20250624 02:14:02.652086 32766 raft_consensus.cc:1062] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c: attempting to promote NON_VOTER 3a7ccbc73c7243fd9e789330a7c77baf to VOTER
I20250624 02:14:02.653681 32766 consensus_queue.cc:237] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 5, Committed index: 5, Last appended: 4.5, Last appended by leader: 0, Current term: 4, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 6 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } } peers { permanent_uuid: "3a7ccbc73c7243fd9e789330a7c77baf" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 46295 } attrs { promote: false } } unsafe_config_change: true
I20250624 02:14:02.658581 32371 raft_consensus.cc:1273] T 9cb47de9f2b743b4a245a41ae82cadd9 P 3a7ccbc73c7243fd9e789330a7c77baf [term 4 LEARNER]: Refusing update from remote peer 61196268dddc47f0966f365f9829389c: Log matching property violated. Preceding OpId in replica: term: 4 index: 5. Preceding OpId from leader: term: 4 index: 6. (index mismatch)
I20250624 02:14:02.659518 32105 raft_consensus.cc:1273] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [term 4 FOLLOWER]: Refusing update from remote peer 61196268dddc47f0966f365f9829389c: Log matching property violated. Preceding OpId in replica: term: 4 index: 5. Preceding OpId from leader: term: 4 index: 6. (index mismatch)
I20250624 02:14:02.659812 32766 consensus_queue.cc:1035] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [LEADER]: Connected to new peer: Peer: permanent_uuid: "3a7ccbc73c7243fd9e789330a7c77baf" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 46295 } attrs { promote: false }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 6, Last known committed idx: 5, Time since last communication: 0.000s
I20250624 02:14:02.660727 32764 consensus_queue.cc:1035] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [LEADER]: Connected to new peer: Peer: permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 6, Last known committed idx: 5, Time since last communication: 0.000s
I20250624 02:14:02.666841 32766 raft_consensus.cc:2953] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c [term 4 LEADER]: Committing config change with OpId 4.6: config changed from index 5 to 6, 3a7ccbc73c7243fd9e789330a7c77baf (127.31.42.195) changed from NON_VOTER to VOTER. New config: { opid_index: 6 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } } peers { permanent_uuid: "3a7ccbc73c7243fd9e789330a7c77baf" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 46295 } attrs { promote: false } } unsafe_config_change: true }
I20250624 02:14:02.668382 32105 raft_consensus.cc:2953] T 9cb47de9f2b743b4a245a41ae82cadd9 P 11c448593b9b4152a40bf947653f01e2 [term 4 FOLLOWER]: Committing config change with OpId 4.6: config changed from index 5 to 6, 3a7ccbc73c7243fd9e789330a7c77baf (127.31.42.195) changed from NON_VOTER to VOTER. New config: { opid_index: 6 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } } peers { permanent_uuid: "3a7ccbc73c7243fd9e789330a7c77baf" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 46295 } attrs { promote: false } } unsafe_config_change: true }
I20250624 02:14:02.670037 32371 raft_consensus.cc:2953] T 9cb47de9f2b743b4a245a41ae82cadd9 P 3a7ccbc73c7243fd9e789330a7c77baf [term 4 FOLLOWER]: Committing config change with OpId 4.6: config changed from index 5 to 6, 3a7ccbc73c7243fd9e789330a7c77baf (127.31.42.195) changed from NON_VOTER to VOTER. New config: { opid_index: 6 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } } peers { permanent_uuid: "3a7ccbc73c7243fd9e789330a7c77baf" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 46295 } attrs { promote: false } } unsafe_config_change: true }
I20250624 02:14:02.679118 32641 catalog_manager.cc:5582] T 9cb47de9f2b743b4a245a41ae82cadd9 P 61196268dddc47f0966f365f9829389c reported cstate change: config changed from index 5 to 6, 3a7ccbc73c7243fd9e789330a7c77baf (127.31.42.195) changed from NON_VOTER to VOTER. New cstate: current_term: 4 leader_uuid: "61196268dddc47f0966f365f9829389c" committed_config { opid_index: 6 OBSOLETE_local: false peers { permanent_uuid: "11c448593b9b4152a40bf947653f01e2" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 42355 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "61196268dddc47f0966f365f9829389c" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35929 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "3a7ccbc73c7243fd9e789330a7c77baf" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 46295 } attrs { promote: false } health_report { overall_health: HEALTHY } } unsafe_config_change: true }
I20250624 02:14:02.688385 31915 kudu-admin-test.cc:751] Waiting for Master to see new config...
I20250624 02:14:02.708890 31915 kudu-admin-test.cc:756] Tablet locations:
tablet_locations {
tablet_id: "9cb47de9f2b743b4a245a41ae82cadd9"
DEPRECATED_stale: false
partition {
partition_key_start: ""
partition_key_end: ""
}
interned_replicas {
ts_info_idx: 0
role: FOLLOWER
}
interned_replicas {
ts_info_idx: 1
role: LEADER
}
interned_replicas {
ts_info_idx: 2
role: FOLLOWER
}
}
ts_infos {
permanent_uuid: "11c448593b9b4152a40bf947653f01e2"
rpc_addresses {
host: "127.31.42.193"
port: 42355
}
}
ts_infos {
permanent_uuid: "61196268dddc47f0966f365f9829389c"
rpc_addresses {
host: "127.31.42.194"
port: 35929
}
}
ts_infos {
permanent_uuid: "3a7ccbc73c7243fd9e789330a7c77baf"
rpc_addresses {
host: "127.31.42.195"
port: 46295
}
}
I20250624 02:14:02.711861 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 32021
I20250624 02:14:02.738044 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 32154
I20250624 02:14:02.767463 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 32287
I20250624 02:14:02.795727 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 32593
2025-06-24T02:14:02Z chronyd exiting
[ OK ] AdminCliTest.TestUnsafeChangeConfigForConfigWithTwoNodes (20334 ms)
[ RUN ] AdminCliTest.TestGracefulSpecificLeaderStepDown
I20250624 02:14:02.862118 31915 test_util.cc:276] Using random seed: -506202612
I20250624 02:14:02.868160 31915 ts_itest-base.cc:115] Starting cluster with:
I20250624 02:14:02.868333 31915 ts_itest-base.cc:116] --------------
I20250624 02:14:02.868446 31915 ts_itest-base.cc:117] 3 tablet servers
I20250624 02:14:02.868556 31915 ts_itest-base.cc:118] 3 replicas per TS
I20250624 02:14:02.868657 31915 ts_itest-base.cc:119] --------------
2025-06-24T02:14:02Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2025-06-24T02:14:02Z Disabled control of system clock
I20250624 02:14:02.908514 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.31.42.254:33887
--webserver_interface=127.31.42.254
--webserver_port=0
--builtin_ntp_servers=127.31.42.212:32785
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.31.42.254:33887
--catalog_manager_wait_for_new_tablets_to_elect_leader=false with env {}
W20250624 02:14:03.218772 318 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:14:03.219339 318 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:14:03.219760 318 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:14:03.251679 318 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20250624 02:14:03.251986 318 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:14:03.252188 318 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20250624 02:14:03.252383 318 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20250624 02:14:03.288877 318 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:32785
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
--catalog_manager_wait_for_new_tablets_to_elect_leader=false
--ipki_ca_key_size=768
--master_addresses=127.31.42.254:33887
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.31.42.254:33887
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb
--webserver_interface=127.31.42.254
--webserver_port=0
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/logs
--logbuflevel=-1
--logtostderr=true
Master server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:14:03.290202 318 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:14:03.291919 318 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:14:03.309079 327 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:14:03.309104 325 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:14:03.310259 318 server_base.cc:1048] running on GCE node
W20250624 02:14:03.309264 324 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:14:04.488615 318 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:14:04.491394 318 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:14:04.492767 318 hybrid_clock.cc:648] HybridClock initialized: now 1750731244492741 us; error 46 us; skew 500 ppm
I20250624 02:14:04.493616 318 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:14:04.500161 318 webserver.cc:469] Webserver started at http://127.31.42.254:37001/ using document root <none> and password file <none>
I20250624 02:14:04.501114 318 fs_manager.cc:362] Metadata directory not provided
I20250624 02:14:04.501315 318 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:14:04.501793 318 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:14:04.506767 318 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/instance:
uuid: "6965571ab4cb42a68b425b5f3c020753"
format_stamp: "Formatted at 2025-06-24 02:14:04 on dist-test-slave-5k9r"
I20250624 02:14:04.507942 318 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal/instance:
uuid: "6965571ab4cb42a68b425b5f3c020753"
format_stamp: "Formatted at 2025-06-24 02:14:04 on dist-test-slave-5k9r"
I20250624 02:14:04.515422 318 fs_manager.cc:696] Time spent creating directory manager: real 0.007s user 0.008s sys 0.000s
I20250624 02:14:04.521136 334 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:04.522226 318 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.002s sys 0.001s
I20250624 02:14:04.522550 318 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
uuid: "6965571ab4cb42a68b425b5f3c020753"
format_stamp: "Formatted at 2025-06-24 02:14:04 on dist-test-slave-5k9r"
I20250624 02:14:04.522888 318 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:14:04.577164 318 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:14:04.578711 318 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:14:04.579173 318 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:14:04.649705 318 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.254:33887
I20250624 02:14:04.649806 385 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.254:33887 every 8 connection(s)
I20250624 02:14:04.652448 318 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb
I20250624 02:14:04.656651 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 318
I20250624 02:14:04.657166 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal/instance
I20250624 02:14:04.658481 386 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:14:04.684136 386 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 6965571ab4cb42a68b425b5f3c020753: Bootstrap starting.
I20250624 02:14:04.689785 386 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 6965571ab4cb42a68b425b5f3c020753: Neither blocks nor log segments found. Creating new log.
I20250624 02:14:04.691516 386 log.cc:826] T 00000000000000000000000000000000 P 6965571ab4cb42a68b425b5f3c020753: Log is configured to *not* fsync() on all Append() calls
I20250624 02:14:04.696020 386 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 6965571ab4cb42a68b425b5f3c020753: No bootstrap required, opened a new log
I20250624 02:14:04.713805 386 raft_consensus.cc:357] T 00000000000000000000000000000000 P 6965571ab4cb42a68b425b5f3c020753 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6965571ab4cb42a68b425b5f3c020753" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 33887 } }
I20250624 02:14:04.714529 386 raft_consensus.cc:383] T 00000000000000000000000000000000 P 6965571ab4cb42a68b425b5f3c020753 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:14:04.714736 386 raft_consensus.cc:738] T 00000000000000000000000000000000 P 6965571ab4cb42a68b425b5f3c020753 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 6965571ab4cb42a68b425b5f3c020753, State: Initialized, Role: FOLLOWER
I20250624 02:14:04.715330 386 consensus_queue.cc:260] T 00000000000000000000000000000000 P 6965571ab4cb42a68b425b5f3c020753 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6965571ab4cb42a68b425b5f3c020753" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 33887 } }
I20250624 02:14:04.715796 386 raft_consensus.cc:397] T 00000000000000000000000000000000 P 6965571ab4cb42a68b425b5f3c020753 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 02:14:04.716017 386 raft_consensus.cc:491] T 00000000000000000000000000000000 P 6965571ab4cb42a68b425b5f3c020753 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 02:14:04.716270 386 raft_consensus.cc:3058] T 00000000000000000000000000000000 P 6965571ab4cb42a68b425b5f3c020753 [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:14:04.720803 386 raft_consensus.cc:513] T 00000000000000000000000000000000 P 6965571ab4cb42a68b425b5f3c020753 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6965571ab4cb42a68b425b5f3c020753" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 33887 } }
I20250624 02:14:04.721467 386 leader_election.cc:304] T 00000000000000000000000000000000 P 6965571ab4cb42a68b425b5f3c020753 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 6965571ab4cb42a68b425b5f3c020753; no voters:
I20250624 02:14:04.723148 386 leader_election.cc:290] T 00000000000000000000000000000000 P 6965571ab4cb42a68b425b5f3c020753 [CANDIDATE]: Term 1 election: Requested vote from peers
I20250624 02:14:04.723853 391 raft_consensus.cc:2802] T 00000000000000000000000000000000 P 6965571ab4cb42a68b425b5f3c020753 [term 1 FOLLOWER]: Leader election won for term 1
I20250624 02:14:04.726073 391 raft_consensus.cc:695] T 00000000000000000000000000000000 P 6965571ab4cb42a68b425b5f3c020753 [term 1 LEADER]: Becoming Leader. State: Replica: 6965571ab4cb42a68b425b5f3c020753, State: Running, Role: LEADER
I20250624 02:14:04.726848 391 consensus_queue.cc:237] T 00000000000000000000000000000000 P 6965571ab4cb42a68b425b5f3c020753 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6965571ab4cb42a68b425b5f3c020753" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 33887 } }
I20250624 02:14:04.727813 386 sys_catalog.cc:564] T 00000000000000000000000000000000 P 6965571ab4cb42a68b425b5f3c020753 [sys.catalog]: configured and running, proceeding with master startup.
I20250624 02:14:04.737586 393 sys_catalog.cc:455] T 00000000000000000000000000000000 P 6965571ab4cb42a68b425b5f3c020753 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 6965571ab4cb42a68b425b5f3c020753. Latest consensus state: current_term: 1 leader_uuid: "6965571ab4cb42a68b425b5f3c020753" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6965571ab4cb42a68b425b5f3c020753" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 33887 } } }
I20250624 02:14:04.737841 392 sys_catalog.cc:455] T 00000000000000000000000000000000 P 6965571ab4cb42a68b425b5f3c020753 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "6965571ab4cb42a68b425b5f3c020753" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6965571ab4cb42a68b425b5f3c020753" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 33887 } } }
I20250624 02:14:04.738557 393 sys_catalog.cc:458] T 00000000000000000000000000000000 P 6965571ab4cb42a68b425b5f3c020753 [sys.catalog]: This master's current role is: LEADER
I20250624 02:14:04.738605 392 sys_catalog.cc:458] T 00000000000000000000000000000000 P 6965571ab4cb42a68b425b5f3c020753 [sys.catalog]: This master's current role is: LEADER
I20250624 02:14:04.742519 400 catalog_manager.cc:1477] Loading table and tablet metadata into memory...
I20250624 02:14:04.755777 400 catalog_manager.cc:1486] Initializing Kudu cluster ID...
I20250624 02:14:04.763278 408 catalog_manager.cc:797] Waiting for catalog manager background task thread to start: Service unavailable: Catalog manager is not initialized. State: Starting
I20250624 02:14:04.776786 400 catalog_manager.cc:1349] Generated new cluster ID: 422d63babfd94fd6b36feb907ef0d3e0
I20250624 02:14:04.777124 400 catalog_manager.cc:1497] Initializing Kudu internal certificate authority...
I20250624 02:14:04.796185 400 catalog_manager.cc:1372] Generated new certificate authority record
I20250624 02:14:04.797714 400 catalog_manager.cc:1506] Loading token signing keys...
I20250624 02:14:04.811573 400 catalog_manager.cc:5955] T 00000000000000000000000000000000 P 6965571ab4cb42a68b425b5f3c020753: Generated new TSK 0
I20250624 02:14:04.812600 400 catalog_manager.cc:1516] Initializing in-progress tserver states...
I20250624 02:14:04.821245 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.193:0
--local_ip_for_outbound_sockets=127.31.42.193
--webserver_interface=127.31.42.193
--webserver_port=0
--tserver_master_addrs=127.31.42.254:33887
--builtin_ntp_servers=127.31.42.212:32785
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--enable_leader_failure_detection=false with env {}
W20250624 02:14:05.147125 410 flags.cc:425] Enabled unsafe flag: --enable_leader_failure_detection=false
W20250624 02:14:05.147851 410 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:14:05.148139 410 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:14:05.148663 410 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:14:05.181758 410 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:14:05.182658 410 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.193
I20250624 02:14:05.218194 410 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:32785
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--enable_leader_failure_detection=false
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.193:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/info.pb
--webserver_interface=127.31.42.193
--webserver_port=0
--tserver_master_addrs=127.31.42.254:33887
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.193
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:14:05.219497 410 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:14:05.221195 410 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:14:05.239667 417 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:14:05.240769 419 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:14:05.241207 416 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:14:05.243108 410 server_base.cc:1048] running on GCE node
I20250624 02:14:06.425896 410 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:14:06.428757 410 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:14:06.430322 410 hybrid_clock.cc:648] HybridClock initialized: now 1750731246430225 us; error 112 us; skew 500 ppm
I20250624 02:14:06.431159 410 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:14:06.438757 410 webserver.cc:469] Webserver started at http://127.31.42.193:46445/ using document root <none> and password file <none>
I20250624 02:14:06.440038 410 fs_manager.cc:362] Metadata directory not provided
I20250624 02:14:06.440321 410 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:14:06.440788 410 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:14:06.445839 410 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/instance:
uuid: "928b4408e0a14b47b730f79383b1ca5d"
format_stamp: "Formatted at 2025-06-24 02:14:06 on dist-test-slave-5k9r"
I20250624 02:14:06.447178 410 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal/instance:
uuid: "928b4408e0a14b47b730f79383b1ca5d"
format_stamp: "Formatted at 2025-06-24 02:14:06 on dist-test-slave-5k9r"
I20250624 02:14:06.455993 410 fs_manager.cc:696] Time spent creating directory manager: real 0.008s user 0.005s sys 0.003s
I20250624 02:14:06.463006 426 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:06.464473 410 fs_manager.cc:730] Time spent opening block manager: real 0.005s user 0.002s sys 0.003s
I20250624 02:14:06.464887 410 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
uuid: "928b4408e0a14b47b730f79383b1ca5d"
format_stamp: "Formatted at 2025-06-24 02:14:06 on dist-test-slave-5k9r"
I20250624 02:14:06.465226 410 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:14:06.529268 410 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:14:06.530917 410 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:14:06.531365 410 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:14:06.534564 410 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:14:06.538923 410 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 02:14:06.539134 410 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:06.539419 410 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 02:14:06.539575 410 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:06.693693 410 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.193:44327
I20250624 02:14:06.693849 538 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.193:44327 every 8 connection(s)
I20250624 02:14:06.696328 410 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/info.pb
I20250624 02:14:06.706700 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 410
I20250624 02:14:06.707180 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal/instance
I20250624 02:14:06.713856 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.194:0
--local_ip_for_outbound_sockets=127.31.42.194
--webserver_interface=127.31.42.194
--webserver_port=0
--tserver_master_addrs=127.31.42.254:33887
--builtin_ntp_servers=127.31.42.212:32785
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--enable_leader_failure_detection=false with env {}
I20250624 02:14:06.720610 539 heartbeater.cc:344] Connected to a master server at 127.31.42.254:33887
I20250624 02:14:06.721186 539 heartbeater.cc:461] Registering TS with master...
I20250624 02:14:06.722576 539 heartbeater.cc:507] Master 127.31.42.254:33887 requested a full tablet report, sending...
I20250624 02:14:06.726059 351 ts_manager.cc:194] Registered new tserver with Master: 928b4408e0a14b47b730f79383b1ca5d (127.31.42.193:44327)
I20250624 02:14:06.729149 351 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.193:43395
W20250624 02:14:07.031977 543 flags.cc:425] Enabled unsafe flag: --enable_leader_failure_detection=false
W20250624 02:14:07.032578 543 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:14:07.032788 543 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:14:07.033236 543 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:14:07.067261 543 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:14:07.068220 543 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.194
I20250624 02:14:07.104529 543 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:32785
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--enable_leader_failure_detection=false
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.194:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/info.pb
--webserver_interface=127.31.42.194
--webserver_port=0
--tserver_master_addrs=127.31.42.254:33887
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.194
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:14:07.105846 543 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:14:07.107582 543 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:14:07.124974 550 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:14:07.734161 539 heartbeater.cc:499] Master 127.31.42.254:33887 was elected leader, sending a full tablet report...
W20250624 02:14:07.128275 552 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:14:07.125427 549 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:14:08.300714 551 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Connection time-out
I20250624 02:14:08.300803 543 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
I20250624 02:14:08.304800 543 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:14:08.307524 543 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:14:08.308980 543 hybrid_clock.cc:648] HybridClock initialized: now 1750731248308940 us; error 59 us; skew 500 ppm
I20250624 02:14:08.309782 543 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:14:08.320807 543 webserver.cc:469] Webserver started at http://127.31.42.194:36241/ using document root <none> and password file <none>
I20250624 02:14:08.321799 543 fs_manager.cc:362] Metadata directory not provided
I20250624 02:14:08.322057 543 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:14:08.322481 543 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:14:08.327224 543 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/instance:
uuid: "dae230fe241043ba8446507c5eece432"
format_stamp: "Formatted at 2025-06-24 02:14:08 on dist-test-slave-5k9r"
I20250624 02:14:08.328495 543 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal/instance:
uuid: "dae230fe241043ba8446507c5eece432"
format_stamp: "Formatted at 2025-06-24 02:14:08 on dist-test-slave-5k9r"
I20250624 02:14:08.335980 543 fs_manager.cc:696] Time spent creating directory manager: real 0.007s user 0.001s sys 0.008s
I20250624 02:14:08.341964 559 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:08.343228 543 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.003s sys 0.000s
I20250624 02:14:08.343575 543 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
uuid: "dae230fe241043ba8446507c5eece432"
format_stamp: "Formatted at 2025-06-24 02:14:08 on dist-test-slave-5k9r"
I20250624 02:14:08.343940 543 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:14:08.394469 543 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:14:08.395998 543 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:14:08.396458 543 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:14:08.399040 543 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:14:08.403661 543 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 02:14:08.403898 543 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:08.404162 543 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 02:14:08.404326 543 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:08.545490 543 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.194:39911
I20250624 02:14:08.545605 671 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.194:39911 every 8 connection(s)
I20250624 02:14:08.548038 543 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/info.pb
I20250624 02:14:08.557346 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 543
I20250624 02:14:08.557751 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal/instance
I20250624 02:14:08.563830 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.195:0
--local_ip_for_outbound_sockets=127.31.42.195
--webserver_interface=127.31.42.195
--webserver_port=0
--tserver_master_addrs=127.31.42.254:33887
--builtin_ntp_servers=127.31.42.212:32785
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--enable_leader_failure_detection=false with env {}
I20250624 02:14:08.570021 672 heartbeater.cc:344] Connected to a master server at 127.31.42.254:33887
I20250624 02:14:08.570472 672 heartbeater.cc:461] Registering TS with master...
I20250624 02:14:08.571537 672 heartbeater.cc:507] Master 127.31.42.254:33887 requested a full tablet report, sending...
I20250624 02:14:08.574067 351 ts_manager.cc:194] Registered new tserver with Master: dae230fe241043ba8446507c5eece432 (127.31.42.194:39911)
I20250624 02:14:08.575991 351 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.194:60405
W20250624 02:14:08.887840 676 flags.cc:425] Enabled unsafe flag: --enable_leader_failure_detection=false
W20250624 02:14:08.888448 676 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:14:08.888659 676 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:14:08.889099 676 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:14:08.920507 676 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:14:08.921396 676 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.195
I20250624 02:14:08.958510 676 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:32785
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--enable_leader_failure_detection=false
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.195:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/info.pb
--webserver_interface=127.31.42.195
--webserver_port=0
--tserver_master_addrs=127.31.42.254:33887
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.195
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:14:08.959758 676 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:14:08.961476 676 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:14:08.977018 682 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:14:09.579831 672 heartbeater.cc:499] Master 127.31.42.254:33887 was elected leader, sending a full tablet report...
W20250624 02:14:08.978822 683 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:14:08.977684 685 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:14:10.195118 684 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Resolving timed out after 1214 milliseconds
I20250624 02:14:10.195253 676 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
I20250624 02:14:10.196753 676 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:14:10.198976 676 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:14:10.200325 676 hybrid_clock.cc:648] HybridClock initialized: now 1750731250200301 us; error 44 us; skew 500 ppm
I20250624 02:14:10.201184 676 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:14:10.207664 676 webserver.cc:469] Webserver started at http://127.31.42.195:34877/ using document root <none> and password file <none>
I20250624 02:14:10.208648 676 fs_manager.cc:362] Metadata directory not provided
I20250624 02:14:10.208866 676 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:14:10.209331 676 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:14:10.213913 676 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/instance:
uuid: "25fc677d655f4cd1afb99e435d05848b"
format_stamp: "Formatted at 2025-06-24 02:14:10 on dist-test-slave-5k9r"
I20250624 02:14:10.215135 676 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal/instance:
uuid: "25fc677d655f4cd1afb99e435d05848b"
format_stamp: "Formatted at 2025-06-24 02:14:10 on dist-test-slave-5k9r"
I20250624 02:14:10.222455 676 fs_manager.cc:696] Time spent creating directory manager: real 0.007s user 0.006s sys 0.001s
I20250624 02:14:10.228207 692 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:10.229285 676 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.005s sys 0.001s
I20250624 02:14:10.229698 676 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
uuid: "25fc677d655f4cd1afb99e435d05848b"
format_stamp: "Formatted at 2025-06-24 02:14:10 on dist-test-slave-5k9r"
I20250624 02:14:10.230093 676 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:14:10.282274 676 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:14:10.283740 676 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:14:10.284204 676 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:14:10.286784 676 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:14:10.290989 676 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 02:14:10.291198 676 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:10.291464 676 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 02:14:10.291615 676 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:10.424247 676 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.195:35737
I20250624 02:14:10.424355 804 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.195:35737 every 8 connection(s)
I20250624 02:14:10.427109 676 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/info.pb
I20250624 02:14:10.428063 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 676
I20250624 02:14:10.428485 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestGracefulSpecificLeaderStepDown.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal/instance
I20250624 02:14:10.450783 805 heartbeater.cc:344] Connected to a master server at 127.31.42.254:33887
I20250624 02:14:10.451296 805 heartbeater.cc:461] Registering TS with master...
I20250624 02:14:10.452306 805 heartbeater.cc:507] Master 127.31.42.254:33887 requested a full tablet report, sending...
I20250624 02:14:10.454560 351 ts_manager.cc:194] Registered new tserver with Master: 25fc677d655f4cd1afb99e435d05848b (127.31.42.195:35737)
I20250624 02:14:10.455933 351 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.195:48717
I20250624 02:14:10.464754 31915 external_mini_cluster.cc:934] 3 TS(s) registered with all masters
I20250624 02:14:10.499743 351 catalog_manager.cc:2232] Servicing CreateTable request from {username='slave'} at 127.0.0.1:48686:
name: "TestTable"
schema {
columns {
name: "key"
type: INT32
is_key: true
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "int_val"
type: INT32
is_key: false
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "string_val"
type: STRING
is_key: false
is_nullable: true
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
range_schema {
columns {
name: "key"
}
}
}
owner: "alice"
W20250624 02:14:10.520418 351 catalog_manager.cc:6944] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table TestTable in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20250624 02:14:10.580737 740 tablet_service.cc:1468] Processing CreateTablet for tablet a63dc6b1d4194ad99eec6a1f2d901af5 (DEFAULT_TABLE table=TestTable [id=727a0a66889142f7961929d0234f12cb]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 02:14:10.583701 740 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet a63dc6b1d4194ad99eec6a1f2d901af5. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:14:10.587339 474 tablet_service.cc:1468] Processing CreateTablet for tablet a63dc6b1d4194ad99eec6a1f2d901af5 (DEFAULT_TABLE table=TestTable [id=727a0a66889142f7961929d0234f12cb]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 02:14:10.589406 474 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet a63dc6b1d4194ad99eec6a1f2d901af5. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:14:10.590408 607 tablet_service.cc:1468] Processing CreateTablet for tablet a63dc6b1d4194ad99eec6a1f2d901af5 (DEFAULT_TABLE table=TestTable [id=727a0a66889142f7961929d0234f12cb]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 02:14:10.592450 607 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet a63dc6b1d4194ad99eec6a1f2d901af5. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:14:10.615862 824 tablet_bootstrap.cc:492] T a63dc6b1d4194ad99eec6a1f2d901af5 P dae230fe241043ba8446507c5eece432: Bootstrap starting.
I20250624 02:14:10.616662 825 tablet_bootstrap.cc:492] T a63dc6b1d4194ad99eec6a1f2d901af5 P 25fc677d655f4cd1afb99e435d05848b: Bootstrap starting.
I20250624 02:14:10.619184 826 tablet_bootstrap.cc:492] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d: Bootstrap starting.
I20250624 02:14:10.624593 825 tablet_bootstrap.cc:654] T a63dc6b1d4194ad99eec6a1f2d901af5 P 25fc677d655f4cd1afb99e435d05848b: Neither blocks nor log segments found. Creating new log.
I20250624 02:14:10.624902 826 tablet_bootstrap.cc:654] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d: Neither blocks nor log segments found. Creating new log.
I20250624 02:14:10.625808 824 tablet_bootstrap.cc:654] T a63dc6b1d4194ad99eec6a1f2d901af5 P dae230fe241043ba8446507c5eece432: Neither blocks nor log segments found. Creating new log.
I20250624 02:14:10.626876 826 log.cc:826] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d: Log is configured to *not* fsync() on all Append() calls
I20250624 02:14:10.627174 825 log.cc:826] T a63dc6b1d4194ad99eec6a1f2d901af5 P 25fc677d655f4cd1afb99e435d05848b: Log is configured to *not* fsync() on all Append() calls
I20250624 02:14:10.628234 824 log.cc:826] T a63dc6b1d4194ad99eec6a1f2d901af5 P dae230fe241043ba8446507c5eece432: Log is configured to *not* fsync() on all Append() calls
I20250624 02:14:10.631916 826 tablet_bootstrap.cc:492] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d: No bootstrap required, opened a new log
I20250624 02:14:10.632400 826 ts_tablet_manager.cc:1397] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d: Time spent bootstrapping tablet: real 0.014s user 0.008s sys 0.005s
I20250624 02:14:10.634178 824 tablet_bootstrap.cc:492] T a63dc6b1d4194ad99eec6a1f2d901af5 P dae230fe241043ba8446507c5eece432: No bootstrap required, opened a new log
I20250624 02:14:10.634706 824 ts_tablet_manager.cc:1397] T a63dc6b1d4194ad99eec6a1f2d901af5 P dae230fe241043ba8446507c5eece432: Time spent bootstrapping tablet: real 0.021s user 0.013s sys 0.003s
I20250624 02:14:10.635057 825 tablet_bootstrap.cc:492] T a63dc6b1d4194ad99eec6a1f2d901af5 P 25fc677d655f4cd1afb99e435d05848b: No bootstrap required, opened a new log
I20250624 02:14:10.635807 825 ts_tablet_manager.cc:1397] T a63dc6b1d4194ad99eec6a1f2d901af5 P 25fc677d655f4cd1afb99e435d05848b: Time spent bootstrapping tablet: real 0.020s user 0.007s sys 0.010s
I20250624 02:14:10.653659 826 raft_consensus.cc:357] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "25fc677d655f4cd1afb99e435d05848b" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35737 } } peers { permanent_uuid: "928b4408e0a14b47b730f79383b1ca5d" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 44327 } } peers { permanent_uuid: "dae230fe241043ba8446507c5eece432" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 39911 } }
I20250624 02:14:10.654992 826 raft_consensus.cc:738] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 928b4408e0a14b47b730f79383b1ca5d, State: Initialized, Role: FOLLOWER
I20250624 02:14:10.655867 826 consensus_queue.cc:260] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "25fc677d655f4cd1afb99e435d05848b" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35737 } } peers { permanent_uuid: "928b4408e0a14b47b730f79383b1ca5d" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 44327 } } peers { permanent_uuid: "dae230fe241043ba8446507c5eece432" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 39911 } }
I20250624 02:14:10.662621 825 raft_consensus.cc:357] T a63dc6b1d4194ad99eec6a1f2d901af5 P 25fc677d655f4cd1afb99e435d05848b [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "25fc677d655f4cd1afb99e435d05848b" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35737 } } peers { permanent_uuid: "928b4408e0a14b47b730f79383b1ca5d" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 44327 } } peers { permanent_uuid: "dae230fe241043ba8446507c5eece432" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 39911 } }
I20250624 02:14:10.663643 825 raft_consensus.cc:738] T a63dc6b1d4194ad99eec6a1f2d901af5 P 25fc677d655f4cd1afb99e435d05848b [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 25fc677d655f4cd1afb99e435d05848b, State: Initialized, Role: FOLLOWER
I20250624 02:14:10.664644 825 consensus_queue.cc:260] T a63dc6b1d4194ad99eec6a1f2d901af5 P 25fc677d655f4cd1afb99e435d05848b [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "25fc677d655f4cd1afb99e435d05848b" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35737 } } peers { permanent_uuid: "928b4408e0a14b47b730f79383b1ca5d" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 44327 } } peers { permanent_uuid: "dae230fe241043ba8446507c5eece432" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 39911 } }
I20250624 02:14:10.666662 826 ts_tablet_manager.cc:1428] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d: Time spent starting tablet: real 0.034s user 0.025s sys 0.008s
I20250624 02:14:10.667887 824 raft_consensus.cc:357] T a63dc6b1d4194ad99eec6a1f2d901af5 P dae230fe241043ba8446507c5eece432 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "25fc677d655f4cd1afb99e435d05848b" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35737 } } peers { permanent_uuid: "928b4408e0a14b47b730f79383b1ca5d" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 44327 } } peers { permanent_uuid: "dae230fe241043ba8446507c5eece432" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 39911 } }
I20250624 02:14:10.669193 824 raft_consensus.cc:738] T a63dc6b1d4194ad99eec6a1f2d901af5 P dae230fe241043ba8446507c5eece432 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: dae230fe241043ba8446507c5eece432, State: Initialized, Role: FOLLOWER
I20250624 02:14:10.670290 824 consensus_queue.cc:260] T a63dc6b1d4194ad99eec6a1f2d901af5 P dae230fe241043ba8446507c5eece432 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "25fc677d655f4cd1afb99e435d05848b" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35737 } } peers { permanent_uuid: "928b4408e0a14b47b730f79383b1ca5d" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 44327 } } peers { permanent_uuid: "dae230fe241043ba8446507c5eece432" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 39911 } }
I20250624 02:14:10.674840 805 heartbeater.cc:499] Master 127.31.42.254:33887 was elected leader, sending a full tablet report...
I20250624 02:14:10.680555 825 ts_tablet_manager.cc:1428] T a63dc6b1d4194ad99eec6a1f2d901af5 P 25fc677d655f4cd1afb99e435d05848b: Time spent starting tablet: real 0.044s user 0.033s sys 0.005s
W20250624 02:14:10.683378 806 tablet.cc:2378] T a63dc6b1d4194ad99eec6a1f2d901af5 P 25fc677d655f4cd1afb99e435d05848b: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20250624 02:14:10.686197 824 ts_tablet_manager.cc:1428] T a63dc6b1d4194ad99eec6a1f2d901af5 P dae230fe241043ba8446507c5eece432: Time spent starting tablet: real 0.051s user 0.024s sys 0.016s
W20250624 02:14:10.705765 540 tablet.cc:2378] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20250624 02:14:10.712644 31915 external_mini_cluster.cc:934] 3 TS(s) registered with all masters
I20250624 02:14:10.716161 31915 ts_itest-base.cc:246] Waiting for 1 tablets on tserver 928b4408e0a14b47b730f79383b1ca5d to finish bootstrapping
I20250624 02:14:10.730335 31915 ts_itest-base.cc:246] Waiting for 1 tablets on tserver dae230fe241043ba8446507c5eece432 to finish bootstrapping
I20250624 02:14:10.742224 31915 ts_itest-base.cc:246] Waiting for 1 tablets on tserver 25fc677d655f4cd1afb99e435d05848b to finish bootstrapping
I20250624 02:14:10.789697 494 tablet_service.cc:1940] Received Run Leader Election RPC: tablet_id: "a63dc6b1d4194ad99eec6a1f2d901af5"
dest_uuid: "928b4408e0a14b47b730f79383b1ca5d"
from {username='slave'} at 127.0.0.1:39036
I20250624 02:14:10.790339 494 raft_consensus.cc:491] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d [term 0 FOLLOWER]: Starting forced leader election (received explicit request)
I20250624 02:14:10.790643 494 raft_consensus.cc:3058] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:14:10.795102 494 raft_consensus.cc:513] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d [term 1 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "25fc677d655f4cd1afb99e435d05848b" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35737 } } peers { permanent_uuid: "928b4408e0a14b47b730f79383b1ca5d" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 44327 } } peers { permanent_uuid: "dae230fe241043ba8446507c5eece432" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 39911 } }
I20250624 02:14:10.797552 494 leader_election.cc:290] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d [CANDIDATE]: Term 1 election: Requested vote from peers 25fc677d655f4cd1afb99e435d05848b (127.31.42.195:35737), dae230fe241043ba8446507c5eece432 (127.31.42.194:39911)
W20250624 02:14:10.805136 673 tablet.cc:2378] T a63dc6b1d4194ad99eec6a1f2d901af5 P dae230fe241043ba8446507c5eece432: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20250624 02:14:10.807727 31915 cluster_itest_util.cc:257] Not converged past 1 yet: 0.0 0.0 0.0
I20250624 02:14:10.810711 760 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "a63dc6b1d4194ad99eec6a1f2d901af5" candidate_uuid: "928b4408e0a14b47b730f79383b1ca5d" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: true dest_uuid: "25fc677d655f4cd1afb99e435d05848b"
I20250624 02:14:10.811455 760 raft_consensus.cc:3058] T a63dc6b1d4194ad99eec6a1f2d901af5 P 25fc677d655f4cd1afb99e435d05848b [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:14:10.811482 627 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "a63dc6b1d4194ad99eec6a1f2d901af5" candidate_uuid: "928b4408e0a14b47b730f79383b1ca5d" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: true dest_uuid: "dae230fe241043ba8446507c5eece432"
I20250624 02:14:10.812213 627 raft_consensus.cc:3058] T a63dc6b1d4194ad99eec6a1f2d901af5 P dae230fe241043ba8446507c5eece432 [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:14:10.818787 760 raft_consensus.cc:2466] T a63dc6b1d4194ad99eec6a1f2d901af5 P 25fc677d655f4cd1afb99e435d05848b [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 928b4408e0a14b47b730f79383b1ca5d in term 1.
I20250624 02:14:10.819151 627 raft_consensus.cc:2466] T a63dc6b1d4194ad99eec6a1f2d901af5 P dae230fe241043ba8446507c5eece432 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 928b4408e0a14b47b730f79383b1ca5d in term 1.
I20250624 02:14:10.820520 428 leader_election.cc:304] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 25fc677d655f4cd1afb99e435d05848b, 928b4408e0a14b47b730f79383b1ca5d; no voters:
I20250624 02:14:10.821686 830 raft_consensus.cc:2802] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d [term 1 FOLLOWER]: Leader election won for term 1
I20250624 02:14:10.824837 830 raft_consensus.cc:695] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d [term 1 LEADER]: Becoming Leader. State: Replica: 928b4408e0a14b47b730f79383b1ca5d, State: Running, Role: LEADER
I20250624 02:14:10.825989 830 consensus_queue.cc:237] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "25fc677d655f4cd1afb99e435d05848b" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35737 } } peers { permanent_uuid: "928b4408e0a14b47b730f79383b1ca5d" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 44327 } } peers { permanent_uuid: "dae230fe241043ba8446507c5eece432" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 39911 } }
I20250624 02:14:10.836328 351 catalog_manager.cc:5582] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d reported cstate change: term changed from 0 to 1, leader changed from <none> to 928b4408e0a14b47b730f79383b1ca5d (127.31.42.193). New cstate: current_term: 1 leader_uuid: "928b4408e0a14b47b730f79383b1ca5d" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "25fc677d655f4cd1afb99e435d05848b" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35737 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "928b4408e0a14b47b730f79383b1ca5d" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 44327 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "dae230fe241043ba8446507c5eece432" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 39911 } health_report { overall_health: UNKNOWN } } }
I20250624 02:14:10.913669 31915 cluster_itest_util.cc:257] Not converged past 1 yet: 1.1 0.0 0.0
I20250624 02:14:11.119277 31915 cluster_itest_util.cc:257] Not converged past 1 yet: 1.1 0.0 0.0
I20250624 02:14:11.286015 830 consensus_queue.cc:1035] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d [LEADER]: Connected to new peer: Peer: permanent_uuid: "25fc677d655f4cd1afb99e435d05848b" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35737 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20250624 02:14:11.304255 843 consensus_queue.cc:1035] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d [LEADER]: Connected to new peer: Peer: permanent_uuid: "dae230fe241043ba8446507c5eece432" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 39911 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20250624 02:14:13.114087 494 tablet_service.cc:1968] Received LeaderStepDown RPC: tablet_id: "a63dc6b1d4194ad99eec6a1f2d901af5"
dest_uuid: "928b4408e0a14b47b730f79383b1ca5d"
mode: GRACEFUL
from {username='slave'} at 127.0.0.1:39048
I20250624 02:14:13.114581 494 raft_consensus.cc:604] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d [term 1 LEADER]: Received request to transfer leadership
I20250624 02:14:13.321466 868 raft_consensus.cc:991] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d: : Instructing follower dae230fe241043ba8446507c5eece432 to start an election
I20250624 02:14:13.321815 843 raft_consensus.cc:1079] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d [term 1 LEADER]: Signalling peer dae230fe241043ba8446507c5eece432 to start an election
I20250624 02:14:13.323153 627 tablet_service.cc:1940] Received Run Leader Election RPC: tablet_id: "a63dc6b1d4194ad99eec6a1f2d901af5"
dest_uuid: "dae230fe241043ba8446507c5eece432"
from {username='slave'} at 127.31.42.193:45807
I20250624 02:14:13.323661 627 raft_consensus.cc:491] T a63dc6b1d4194ad99eec6a1f2d901af5 P dae230fe241043ba8446507c5eece432 [term 1 FOLLOWER]: Starting forced leader election (received explicit request)
I20250624 02:14:13.323940 627 raft_consensus.cc:3058] T a63dc6b1d4194ad99eec6a1f2d901af5 P dae230fe241043ba8446507c5eece432 [term 1 FOLLOWER]: Advancing to term 2
I20250624 02:14:13.328037 627 raft_consensus.cc:513] T a63dc6b1d4194ad99eec6a1f2d901af5 P dae230fe241043ba8446507c5eece432 [term 2 FOLLOWER]: Starting forced leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "25fc677d655f4cd1afb99e435d05848b" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35737 } } peers { permanent_uuid: "928b4408e0a14b47b730f79383b1ca5d" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 44327 } } peers { permanent_uuid: "dae230fe241043ba8446507c5eece432" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 39911 } }
I20250624 02:14:13.330303 627 leader_election.cc:290] T a63dc6b1d4194ad99eec6a1f2d901af5 P dae230fe241043ba8446507c5eece432 [CANDIDATE]: Term 2 election: Requested vote from peers 25fc677d655f4cd1afb99e435d05848b (127.31.42.195:35737), 928b4408e0a14b47b730f79383b1ca5d (127.31.42.193:44327)
I20250624 02:14:13.343564 760 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "a63dc6b1d4194ad99eec6a1f2d901af5" candidate_uuid: "dae230fe241043ba8446507c5eece432" candidate_term: 2 candidate_status { last_received { term: 1 index: 1 } } ignore_live_leader: true dest_uuid: "25fc677d655f4cd1afb99e435d05848b"
I20250624 02:14:13.344033 760 raft_consensus.cc:3058] T a63dc6b1d4194ad99eec6a1f2d901af5 P 25fc677d655f4cd1afb99e435d05848b [term 1 FOLLOWER]: Advancing to term 2
I20250624 02:14:13.343981 494 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "a63dc6b1d4194ad99eec6a1f2d901af5" candidate_uuid: "dae230fe241043ba8446507c5eece432" candidate_term: 2 candidate_status { last_received { term: 1 index: 1 } } ignore_live_leader: true dest_uuid: "928b4408e0a14b47b730f79383b1ca5d"
I20250624 02:14:13.344475 494 raft_consensus.cc:3053] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d [term 1 LEADER]: Stepping down as leader of term 1
I20250624 02:14:13.344724 494 raft_consensus.cc:738] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d [term 1 LEADER]: Becoming Follower/Learner. State: Replica: 928b4408e0a14b47b730f79383b1ca5d, State: Running, Role: LEADER
I20250624 02:14:13.345191 494 consensus_queue.cc:260] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 1, Committed index: 1, Last appended: 1.1, Last appended by leader: 1, Current term: 1, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "25fc677d655f4cd1afb99e435d05848b" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35737 } } peers { permanent_uuid: "928b4408e0a14b47b730f79383b1ca5d" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 44327 } } peers { permanent_uuid: "dae230fe241043ba8446507c5eece432" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 39911 } }
I20250624 02:14:13.346119 494 raft_consensus.cc:3058] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d [term 1 FOLLOWER]: Advancing to term 2
I20250624 02:14:13.348227 760 raft_consensus.cc:2466] T a63dc6b1d4194ad99eec6a1f2d901af5 P 25fc677d655f4cd1afb99e435d05848b [term 2 FOLLOWER]: Leader election vote request: Granting yes vote for candidate dae230fe241043ba8446507c5eece432 in term 2.
I20250624 02:14:13.349251 561 leader_election.cc:304] T a63dc6b1d4194ad99eec6a1f2d901af5 P dae230fe241043ba8446507c5eece432 [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 25fc677d655f4cd1afb99e435d05848b, dae230fe241043ba8446507c5eece432; no voters:
I20250624 02:14:13.351130 494 raft_consensus.cc:2466] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d [term 2 FOLLOWER]: Leader election vote request: Granting yes vote for candidate dae230fe241043ba8446507c5eece432 in term 2.
I20250624 02:14:13.351408 872 raft_consensus.cc:2802] T a63dc6b1d4194ad99eec6a1f2d901af5 P dae230fe241043ba8446507c5eece432 [term 2 FOLLOWER]: Leader election won for term 2
I20250624 02:14:13.352922 872 raft_consensus.cc:695] T a63dc6b1d4194ad99eec6a1f2d901af5 P dae230fe241043ba8446507c5eece432 [term 2 LEADER]: Becoming Leader. State: Replica: dae230fe241043ba8446507c5eece432, State: Running, Role: LEADER
I20250624 02:14:13.353906 872 consensus_queue.cc:237] T a63dc6b1d4194ad99eec6a1f2d901af5 P dae230fe241043ba8446507c5eece432 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 1, Committed index: 1, Last appended: 1.1, Last appended by leader: 1, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "25fc677d655f4cd1afb99e435d05848b" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35737 } } peers { permanent_uuid: "928b4408e0a14b47b730f79383b1ca5d" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 44327 } } peers { permanent_uuid: "dae230fe241043ba8446507c5eece432" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 39911 } }
I20250624 02:14:13.361678 350 catalog_manager.cc:5582] T a63dc6b1d4194ad99eec6a1f2d901af5 P dae230fe241043ba8446507c5eece432 reported cstate change: term changed from 1 to 2, leader changed from 928b4408e0a14b47b730f79383b1ca5d (127.31.42.193) to dae230fe241043ba8446507c5eece432 (127.31.42.194). New cstate: current_term: 2 leader_uuid: "dae230fe241043ba8446507c5eece432" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "25fc677d655f4cd1afb99e435d05848b" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35737 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "928b4408e0a14b47b730f79383b1ca5d" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 44327 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "dae230fe241043ba8446507c5eece432" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 39911 } health_report { overall_health: HEALTHY } } }
I20250624 02:14:13.794948 760 raft_consensus.cc:1273] T a63dc6b1d4194ad99eec6a1f2d901af5 P 25fc677d655f4cd1afb99e435d05848b [term 2 FOLLOWER]: Refusing update from remote peer dae230fe241043ba8446507c5eece432: Log matching property violated. Preceding OpId in replica: term: 1 index: 1. Preceding OpId from leader: term: 2 index: 2. (index mismatch)
I20250624 02:14:13.796343 872 consensus_queue.cc:1035] T a63dc6b1d4194ad99eec6a1f2d901af5 P dae230fe241043ba8446507c5eece432 [LEADER]: Connected to new peer: Peer: permanent_uuid: "25fc677d655f4cd1afb99e435d05848b" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35737 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 2, Last known committed idx: 1, Time since last communication: 0.000s
I20250624 02:14:13.808322 494 raft_consensus.cc:1273] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d [term 2 FOLLOWER]: Refusing update from remote peer dae230fe241043ba8446507c5eece432: Log matching property violated. Preceding OpId in replica: term: 1 index: 1. Preceding OpId from leader: term: 2 index: 2. (index mismatch)
I20250624 02:14:13.810029 873 consensus_queue.cc:1035] T a63dc6b1d4194ad99eec6a1f2d901af5 P dae230fe241043ba8446507c5eece432 [LEADER]: Connected to new peer: Peer: permanent_uuid: "928b4408e0a14b47b730f79383b1ca5d" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 44327 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 2, Last known committed idx: 1, Time since last communication: 0.001s
I20250624 02:14:15.867166 494 tablet_service.cc:1968] Received LeaderStepDown RPC: tablet_id: "a63dc6b1d4194ad99eec6a1f2d901af5"
dest_uuid: "928b4408e0a14b47b730f79383b1ca5d"
mode: GRACEFUL
from {username='slave'} at 127.0.0.1:35618
I20250624 02:14:15.867791 494 raft_consensus.cc:604] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d [term 2 FOLLOWER]: Received request to transfer leadership
I20250624 02:14:15.868113 494 raft_consensus.cc:612] T a63dc6b1d4194ad99eec6a1f2d901af5 P 928b4408e0a14b47b730f79383b1ca5d [term 2 FOLLOWER]: Rejecting request to transfer leadership while not leader
I20250624 02:14:16.904495 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 410
I20250624 02:14:16.933107 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 543
I20250624 02:14:16.960930 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 676
I20250624 02:14:16.988344 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 318
2025-06-24T02:14:17Z chronyd exiting
[ OK ] AdminCliTest.TestGracefulSpecificLeaderStepDown (14185 ms)
[ RUN ] AdminCliTest.TestDescribeTableColumnFlags
I20250624 02:14:17.047646 31915 test_util.cc:276] Using random seed: -492017086
I20250624 02:14:17.051916 31915 ts_itest-base.cc:115] Starting cluster with:
I20250624 02:14:17.052084 31915 ts_itest-base.cc:116] --------------
I20250624 02:14:17.052264 31915 ts_itest-base.cc:117] 3 tablet servers
I20250624 02:14:17.052399 31915 ts_itest-base.cc:118] 3 replicas per TS
I20250624 02:14:17.052534 31915 ts_itest-base.cc:119] --------------
2025-06-24T02:14:17Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2025-06-24T02:14:17Z Disabled control of system clock
I20250624 02:14:17.090976 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.31.42.254:46783
--webserver_interface=127.31.42.254
--webserver_port=0
--builtin_ntp_servers=127.31.42.212:46495
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.31.42.254:46783 with env {}
W20250624 02:14:17.395874 913 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:14:17.396483 913 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:14:17.396960 913 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:14:17.428400 913 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20250624 02:14:17.428769 913 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:14:17.429030 913 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20250624 02:14:17.429268 913 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20250624 02:14:17.465813 913 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:46495
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.31.42.254:46783
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.31.42.254:46783
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb
--webserver_interface=127.31.42.254
--webserver_port=0
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/logs
--logbuflevel=-1
--logtostderr=true
Master server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:14:17.467199 913 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:14:17.468868 913 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:14:17.483316 919 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:14:17.484755 920 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:14:17.485780 922 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:14:17.485774 913 server_base.cc:1048] running on GCE node
I20250624 02:14:18.670012 913 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:14:18.673311 913 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:14:18.674805 913 hybrid_clock.cc:648] HybridClock initialized: now 1750731258674761 us; error 59 us; skew 500 ppm
I20250624 02:14:18.675633 913 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:14:18.682245 913 webserver.cc:469] Webserver started at http://127.31.42.254:34737/ using document root <none> and password file <none>
I20250624 02:14:18.683182 913 fs_manager.cc:362] Metadata directory not provided
I20250624 02:14:18.683373 913 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:14:18.683801 913 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:14:18.688297 913 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/instance:
uuid: "ea577127edbe4fb9a6ee43b43d3837fe"
format_stamp: "Formatted at 2025-06-24 02:14:18 on dist-test-slave-5k9r"
I20250624 02:14:18.689360 913 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal/instance:
uuid: "ea577127edbe4fb9a6ee43b43d3837fe"
format_stamp: "Formatted at 2025-06-24 02:14:18 on dist-test-slave-5k9r"
I20250624 02:14:18.696676 913 fs_manager.cc:696] Time spent creating directory manager: real 0.007s user 0.009s sys 0.000s
I20250624 02:14:18.702728 929 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:18.703887 913 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.002s sys 0.002s
I20250624 02:14:18.704260 913 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
uuid: "ea577127edbe4fb9a6ee43b43d3837fe"
format_stamp: "Formatted at 2025-06-24 02:14:18 on dist-test-slave-5k9r"
I20250624 02:14:18.704629 913 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:14:18.794812 913 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:14:18.796299 913 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:14:18.796769 913 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:14:18.869879 913 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.254:46783
I20250624 02:14:18.870016 980 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.254:46783 every 8 connection(s)
I20250624 02:14:18.872628 913 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb
I20250624 02:14:18.877808 981 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:14:18.879312 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 913
I20250624 02:14:18.879799 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal/instance
I20250624 02:14:18.904824 981 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P ea577127edbe4fb9a6ee43b43d3837fe: Bootstrap starting.
I20250624 02:14:18.910707 981 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P ea577127edbe4fb9a6ee43b43d3837fe: Neither blocks nor log segments found. Creating new log.
I20250624 02:14:18.912498 981 log.cc:826] T 00000000000000000000000000000000 P ea577127edbe4fb9a6ee43b43d3837fe: Log is configured to *not* fsync() on all Append() calls
I20250624 02:14:18.917225 981 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P ea577127edbe4fb9a6ee43b43d3837fe: No bootstrap required, opened a new log
I20250624 02:14:18.934927 981 raft_consensus.cc:357] T 00000000000000000000000000000000 P ea577127edbe4fb9a6ee43b43d3837fe [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "ea577127edbe4fb9a6ee43b43d3837fe" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 46783 } }
I20250624 02:14:18.935613 981 raft_consensus.cc:383] T 00000000000000000000000000000000 P ea577127edbe4fb9a6ee43b43d3837fe [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:14:18.935892 981 raft_consensus.cc:738] T 00000000000000000000000000000000 P ea577127edbe4fb9a6ee43b43d3837fe [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: ea577127edbe4fb9a6ee43b43d3837fe, State: Initialized, Role: FOLLOWER
I20250624 02:14:18.936618 981 consensus_queue.cc:260] T 00000000000000000000000000000000 P ea577127edbe4fb9a6ee43b43d3837fe [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "ea577127edbe4fb9a6ee43b43d3837fe" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 46783 } }
I20250624 02:14:18.937125 981 raft_consensus.cc:397] T 00000000000000000000000000000000 P ea577127edbe4fb9a6ee43b43d3837fe [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 02:14:18.937417 981 raft_consensus.cc:491] T 00000000000000000000000000000000 P ea577127edbe4fb9a6ee43b43d3837fe [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 02:14:18.937700 981 raft_consensus.cc:3058] T 00000000000000000000000000000000 P ea577127edbe4fb9a6ee43b43d3837fe [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:14:18.942139 981 raft_consensus.cc:513] T 00000000000000000000000000000000 P ea577127edbe4fb9a6ee43b43d3837fe [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "ea577127edbe4fb9a6ee43b43d3837fe" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 46783 } }
I20250624 02:14:18.942867 981 leader_election.cc:304] T 00000000000000000000000000000000 P ea577127edbe4fb9a6ee43b43d3837fe [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: ea577127edbe4fb9a6ee43b43d3837fe; no voters:
I20250624 02:14:18.944542 981 leader_election.cc:290] T 00000000000000000000000000000000 P ea577127edbe4fb9a6ee43b43d3837fe [CANDIDATE]: Term 1 election: Requested vote from peers
I20250624 02:14:18.945333 986 raft_consensus.cc:2802] T 00000000000000000000000000000000 P ea577127edbe4fb9a6ee43b43d3837fe [term 1 FOLLOWER]: Leader election won for term 1
I20250624 02:14:18.947497 986 raft_consensus.cc:695] T 00000000000000000000000000000000 P ea577127edbe4fb9a6ee43b43d3837fe [term 1 LEADER]: Becoming Leader. State: Replica: ea577127edbe4fb9a6ee43b43d3837fe, State: Running, Role: LEADER
I20250624 02:14:18.948194 986 consensus_queue.cc:237] T 00000000000000000000000000000000 P ea577127edbe4fb9a6ee43b43d3837fe [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "ea577127edbe4fb9a6ee43b43d3837fe" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 46783 } }
I20250624 02:14:18.948575 981 sys_catalog.cc:564] T 00000000000000000000000000000000 P ea577127edbe4fb9a6ee43b43d3837fe [sys.catalog]: configured and running, proceeding with master startup.
I20250624 02:14:18.960368 987 sys_catalog.cc:455] T 00000000000000000000000000000000 P ea577127edbe4fb9a6ee43b43d3837fe [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "ea577127edbe4fb9a6ee43b43d3837fe" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "ea577127edbe4fb9a6ee43b43d3837fe" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 46783 } } }
I20250624 02:14:18.961737 988 sys_catalog.cc:455] T 00000000000000000000000000000000 P ea577127edbe4fb9a6ee43b43d3837fe [sys.catalog]: SysCatalogTable state changed. Reason: New leader ea577127edbe4fb9a6ee43b43d3837fe. Latest consensus state: current_term: 1 leader_uuid: "ea577127edbe4fb9a6ee43b43d3837fe" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "ea577127edbe4fb9a6ee43b43d3837fe" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 46783 } } }
I20250624 02:14:18.962738 988 sys_catalog.cc:458] T 00000000000000000000000000000000 P ea577127edbe4fb9a6ee43b43d3837fe [sys.catalog]: This master's current role is: LEADER
I20250624 02:14:18.964439 987 sys_catalog.cc:458] T 00000000000000000000000000000000 P ea577127edbe4fb9a6ee43b43d3837fe [sys.catalog]: This master's current role is: LEADER
I20250624 02:14:18.965641 996 catalog_manager.cc:1477] Loading table and tablet metadata into memory...
I20250624 02:14:18.975874 996 catalog_manager.cc:1486] Initializing Kudu cluster ID...
I20250624 02:14:18.992866 996 catalog_manager.cc:1349] Generated new cluster ID: 1b3c4ba665d14f2fb452d8a9dbf21204
I20250624 02:14:18.993206 996 catalog_manager.cc:1497] Initializing Kudu internal certificate authority...
I20250624 02:14:19.030150 996 catalog_manager.cc:1372] Generated new certificate authority record
I20250624 02:14:19.032279 996 catalog_manager.cc:1506] Loading token signing keys...
I20250624 02:14:19.056432 996 catalog_manager.cc:5955] T 00000000000000000000000000000000 P ea577127edbe4fb9a6ee43b43d3837fe: Generated new TSK 0
I20250624 02:14:19.057607 996 catalog_manager.cc:1516] Initializing in-progress tserver states...
I20250624 02:14:19.085569 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.193:0
--local_ip_for_outbound_sockets=127.31.42.193
--webserver_interface=127.31.42.193
--webserver_port=0
--tserver_master_addrs=127.31.42.254:46783
--builtin_ntp_servers=127.31.42.212:46495
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--consensus_rpc_timeout_ms=30000 with env {}
W20250624 02:14:19.395254 1005 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:14:19.395781 1005 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:14:19.396276 1005 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:14:19.428632 1005 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:14:19.429481 1005 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.193
I20250624 02:14:19.466291 1005 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:46495
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.193:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/info.pb
--webserver_interface=127.31.42.193
--webserver_port=0
--tserver_master_addrs=127.31.42.254:46783
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.193
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:14:19.467623 1005 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:14:19.469275 1005 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:14:19.488134 1012 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:14:19.489624 1005 server_base.cc:1048] running on GCE node
W20250624 02:14:19.488145 1011 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:14:19.488958 1014 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:14:20.688643 1005 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:14:20.691562 1005 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:14:20.693087 1005 hybrid_clock.cc:648] HybridClock initialized: now 1750731260693035 us; error 56 us; skew 500 ppm
I20250624 02:14:20.694201 1005 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:14:20.706601 1005 webserver.cc:469] Webserver started at http://127.31.42.193:39643/ using document root <none> and password file <none>
I20250624 02:14:20.707901 1005 fs_manager.cc:362] Metadata directory not provided
I20250624 02:14:20.708178 1005 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:14:20.708745 1005 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:14:20.715319 1005 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/instance:
uuid: "9152818822e846d787eabcf497a88578"
format_stamp: "Formatted at 2025-06-24 02:14:20 on dist-test-slave-5k9r"
I20250624 02:14:20.716866 1005 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal/instance:
uuid: "9152818822e846d787eabcf497a88578"
format_stamp: "Formatted at 2025-06-24 02:14:20 on dist-test-slave-5k9r"
I20250624 02:14:20.725081 1005 fs_manager.cc:696] Time spent creating directory manager: real 0.007s user 0.009s sys 0.001s
I20250624 02:14:20.730968 1021 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:20.732062 1005 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.003s sys 0.002s
I20250624 02:14:20.732379 1005 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
uuid: "9152818822e846d787eabcf497a88578"
format_stamp: "Formatted at 2025-06-24 02:14:20 on dist-test-slave-5k9r"
I20250624 02:14:20.732712 1005 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:14:20.788393 1005 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:14:20.789882 1005 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:14:20.790371 1005 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:14:20.792958 1005 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:14:20.797176 1005 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 02:14:20.797382 1005 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:20.797645 1005 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 02:14:20.797799 1005 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:20.939407 1005 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.193:33637
I20250624 02:14:20.939520 1133 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.193:33637 every 8 connection(s)
I20250624 02:14:20.941987 1005 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/info.pb
I20250624 02:14:20.945008 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 1005
I20250624 02:14:20.945531 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal/instance
I20250624 02:14:20.952659 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.194:0
--local_ip_for_outbound_sockets=127.31.42.194
--webserver_interface=127.31.42.194
--webserver_port=0
--tserver_master_addrs=127.31.42.254:46783
--builtin_ntp_servers=127.31.42.212:46495
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--consensus_rpc_timeout_ms=30000 with env {}
I20250624 02:14:20.963589 1134 heartbeater.cc:344] Connected to a master server at 127.31.42.254:46783
I20250624 02:14:20.964184 1134 heartbeater.cc:461] Registering TS with master...
I20250624 02:14:20.965598 1134 heartbeater.cc:507] Master 127.31.42.254:46783 requested a full tablet report, sending...
I20250624 02:14:20.968701 946 ts_manager.cc:194] Registered new tserver with Master: 9152818822e846d787eabcf497a88578 (127.31.42.193:33637)
I20250624 02:14:20.970782 946 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.193:55389
W20250624 02:14:21.258133 1138 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:14:21.258651 1138 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:14:21.259160 1138 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:14:21.290481 1138 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:14:21.291342 1138 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.194
I20250624 02:14:21.327226 1138 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:46495
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.194:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/info.pb
--webserver_interface=127.31.42.194
--webserver_port=0
--tserver_master_addrs=127.31.42.254:46783
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.194
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:14:21.328616 1138 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:14:21.330288 1138 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:14:21.347628 1147 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:14:21.975078 1134 heartbeater.cc:499] Master 127.31.42.254:46783 was elected leader, sending a full tablet report...
W20250624 02:14:21.347681 1145 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:14:21.348079 1144 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:14:22.509141 1146 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Connection time-out
I20250624 02:14:22.509209 1138 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
I20250624 02:14:22.513751 1138 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:14:22.516449 1138 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:14:22.517879 1138 hybrid_clock.cc:648] HybridClock initialized: now 1750731262517835 us; error 60 us; skew 500 ppm
I20250624 02:14:22.518708 1138 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:14:22.525409 1138 webserver.cc:469] Webserver started at http://127.31.42.194:33241/ using document root <none> and password file <none>
I20250624 02:14:22.526376 1138 fs_manager.cc:362] Metadata directory not provided
I20250624 02:14:22.526587 1138 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:14:22.527046 1138 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:14:22.531461 1138 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/instance:
uuid: "f413e64320ad4155934538662f239e5e"
format_stamp: "Formatted at 2025-06-24 02:14:22 on dist-test-slave-5k9r"
I20250624 02:14:22.532538 1138 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal/instance:
uuid: "f413e64320ad4155934538662f239e5e"
format_stamp: "Formatted at 2025-06-24 02:14:22 on dist-test-slave-5k9r"
I20250624 02:14:22.539778 1138 fs_manager.cc:696] Time spent creating directory manager: real 0.007s user 0.004s sys 0.004s
I20250624 02:14:22.545506 1154 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:22.546712 1138 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.003s sys 0.000s
I20250624 02:14:22.547019 1138 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
uuid: "f413e64320ad4155934538662f239e5e"
format_stamp: "Formatted at 2025-06-24 02:14:22 on dist-test-slave-5k9r"
I20250624 02:14:22.547338 1138 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:14:22.604475 1138 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:14:22.605973 1138 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:14:22.606395 1138 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:14:22.608975 1138 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:14:22.614284 1138 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 02:14:22.614496 1138 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:22.614733 1138 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 02:14:22.614871 1138 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:22.748862 1138 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.194:40735
I20250624 02:14:22.749018 1267 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.194:40735 every 8 connection(s)
I20250624 02:14:22.751534 1138 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/info.pb
I20250624 02:14:22.760785 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 1138
I20250624 02:14:22.761516 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal/instance
I20250624 02:14:22.768507 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.195:0
--local_ip_for_outbound_sockets=127.31.42.195
--webserver_interface=127.31.42.195
--webserver_port=0
--tserver_master_addrs=127.31.42.254:46783
--builtin_ntp_servers=127.31.42.212:46495
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--consensus_rpc_timeout_ms=30000 with env {}
I20250624 02:14:22.774740 1268 heartbeater.cc:344] Connected to a master server at 127.31.42.254:46783
I20250624 02:14:22.775270 1268 heartbeater.cc:461] Registering TS with master...
I20250624 02:14:22.776342 1268 heartbeater.cc:507] Master 127.31.42.254:46783 requested a full tablet report, sending...
I20250624 02:14:22.778584 946 ts_manager.cc:194] Registered new tserver with Master: f413e64320ad4155934538662f239e5e (127.31.42.194:40735)
I20250624 02:14:22.779788 946 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.194:57221
W20250624 02:14:23.074013 1272 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:14:23.074501 1272 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:14:23.074996 1272 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:14:23.106426 1272 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:14:23.107303 1272 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.195
I20250624 02:14:23.142099 1272 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:46495
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.195:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/info.pb
--webserver_interface=127.31.42.195
--webserver_port=0
--tserver_master_addrs=127.31.42.254:46783
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.195
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:14:23.143534 1272 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:14:23.145336 1272 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:14:23.162154 1279 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:14:23.783664 1268 heartbeater.cc:499] Master 127.31.42.254:46783 was elected leader, sending a full tablet report...
W20250624 02:14:23.162492 1278 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:14:23.163458 1281 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:14:24.350289 1280 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Resolving timed out after 1186 milliseconds
I20250624 02:14:24.350420 1272 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
I20250624 02:14:24.351749 1272 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:14:24.354468 1272 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:14:24.355907 1272 hybrid_clock.cc:648] HybridClock initialized: now 1750731264355870 us; error 60 us; skew 500 ppm
I20250624 02:14:24.356722 1272 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:14:24.363574 1272 webserver.cc:469] Webserver started at http://127.31.42.195:39513/ using document root <none> and password file <none>
I20250624 02:14:24.364548 1272 fs_manager.cc:362] Metadata directory not provided
I20250624 02:14:24.364758 1272 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:14:24.365223 1272 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:14:24.369733 1272 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/instance:
uuid: "19358d5bb43a4a9bbc4d63f54fbe421d"
format_stamp: "Formatted at 2025-06-24 02:14:24 on dist-test-slave-5k9r"
I20250624 02:14:24.370945 1272 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal/instance:
uuid: "19358d5bb43a4a9bbc4d63f54fbe421d"
format_stamp: "Formatted at 2025-06-24 02:14:24 on dist-test-slave-5k9r"
I20250624 02:14:24.378186 1272 fs_manager.cc:696] Time spent creating directory manager: real 0.007s user 0.002s sys 0.005s
I20250624 02:14:24.384020 1288 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:24.385210 1272 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.005s sys 0.001s
I20250624 02:14:24.385561 1272 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
uuid: "19358d5bb43a4a9bbc4d63f54fbe421d"
format_stamp: "Formatted at 2025-06-24 02:14:24 on dist-test-slave-5k9r"
I20250624 02:14:24.385906 1272 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:14:24.432839 1272 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:14:24.434340 1272 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:14:24.434794 1272 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:14:24.437356 1272 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:14:24.441536 1272 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 02:14:24.441751 1272 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:24.442046 1272 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 02:14:24.442260 1272 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:24.577319 1272 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.195:41161
I20250624 02:14:24.577440 1400 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.195:41161 every 8 connection(s)
I20250624 02:14:24.579933 1272 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/info.pb
I20250624 02:14:24.585862 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 1272
I20250624 02:14:24.586313 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestDescribeTableColumnFlags.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal/instance
I20250624 02:14:24.601132 1401 heartbeater.cc:344] Connected to a master server at 127.31.42.254:46783
I20250624 02:14:24.601557 1401 heartbeater.cc:461] Registering TS with master...
I20250624 02:14:24.602710 1401 heartbeater.cc:507] Master 127.31.42.254:46783 requested a full tablet report, sending...
I20250624 02:14:24.605240 946 ts_manager.cc:194] Registered new tserver with Master: 19358d5bb43a4a9bbc4d63f54fbe421d (127.31.42.195:41161)
I20250624 02:14:24.606660 946 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.195:36405
I20250624 02:14:24.620385 31915 external_mini_cluster.cc:934] 3 TS(s) registered with all masters
I20250624 02:14:24.654333 946 catalog_manager.cc:2232] Servicing CreateTable request from {username='slave'} at 127.0.0.1:35986:
name: "TestTable"
schema {
columns {
name: "key"
type: INT32
is_key: true
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "int_val"
type: INT32
is_key: false
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "string_val"
type: STRING
is_key: false
is_nullable: true
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
range_schema {
columns {
name: "key"
}
}
}
owner: "alice"
W20250624 02:14:24.673588 946 catalog_manager.cc:6944] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table TestTable in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20250624 02:14:24.725814 1336 tablet_service.cc:1468] Processing CreateTablet for tablet 685c4476fa424859bcc6ce80b85d1b3c (DEFAULT_TABLE table=TestTable [id=1e8477b5017741c6bf1775f4913173be]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 02:14:24.726135 1203 tablet_service.cc:1468] Processing CreateTablet for tablet 685c4476fa424859bcc6ce80b85d1b3c (DEFAULT_TABLE table=TestTable [id=1e8477b5017741c6bf1775f4913173be]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 02:14:24.728013 1336 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 685c4476fa424859bcc6ce80b85d1b3c. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:14:24.728214 1203 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 685c4476fa424859bcc6ce80b85d1b3c. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:14:24.732980 1069 tablet_service.cc:1468] Processing CreateTablet for tablet 685c4476fa424859bcc6ce80b85d1b3c (DEFAULT_TABLE table=TestTable [id=1e8477b5017741c6bf1775f4913173be]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 02:14:24.735162 1069 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 685c4476fa424859bcc6ce80b85d1b3c. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:14:24.764957 1421 tablet_bootstrap.cc:492] T 685c4476fa424859bcc6ce80b85d1b3c P f413e64320ad4155934538662f239e5e: Bootstrap starting.
I20250624 02:14:24.768518 1420 tablet_bootstrap.cc:492] T 685c4476fa424859bcc6ce80b85d1b3c P 19358d5bb43a4a9bbc4d63f54fbe421d: Bootstrap starting.
I20250624 02:14:24.773627 1421 tablet_bootstrap.cc:654] T 685c4476fa424859bcc6ce80b85d1b3c P f413e64320ad4155934538662f239e5e: Neither blocks nor log segments found. Creating new log.
I20250624 02:14:24.776306 1421 log.cc:826] T 685c4476fa424859bcc6ce80b85d1b3c P f413e64320ad4155934538662f239e5e: Log is configured to *not* fsync() on all Append() calls
I20250624 02:14:24.779413 1422 tablet_bootstrap.cc:492] T 685c4476fa424859bcc6ce80b85d1b3c P 9152818822e846d787eabcf497a88578: Bootstrap starting.
I20250624 02:14:24.780748 1420 tablet_bootstrap.cc:654] T 685c4476fa424859bcc6ce80b85d1b3c P 19358d5bb43a4a9bbc4d63f54fbe421d: Neither blocks nor log segments found. Creating new log.
I20250624 02:14:24.783404 1420 log.cc:826] T 685c4476fa424859bcc6ce80b85d1b3c P 19358d5bb43a4a9bbc4d63f54fbe421d: Log is configured to *not* fsync() on all Append() calls
I20250624 02:14:24.787973 1421 tablet_bootstrap.cc:492] T 685c4476fa424859bcc6ce80b85d1b3c P f413e64320ad4155934538662f239e5e: No bootstrap required, opened a new log
I20250624 02:14:24.788590 1421 ts_tablet_manager.cc:1397] T 685c4476fa424859bcc6ce80b85d1b3c P f413e64320ad4155934538662f239e5e: Time spent bootstrapping tablet: real 0.024s user 0.009s sys 0.014s
I20250624 02:14:24.788630 1422 tablet_bootstrap.cc:654] T 685c4476fa424859bcc6ce80b85d1b3c P 9152818822e846d787eabcf497a88578: Neither blocks nor log segments found. Creating new log.
I20250624 02:14:24.792469 1422 log.cc:826] T 685c4476fa424859bcc6ce80b85d1b3c P 9152818822e846d787eabcf497a88578: Log is configured to *not* fsync() on all Append() calls
I20250624 02:14:24.795359 1420 tablet_bootstrap.cc:492] T 685c4476fa424859bcc6ce80b85d1b3c P 19358d5bb43a4a9bbc4d63f54fbe421d: No bootstrap required, opened a new log
I20250624 02:14:24.796146 1420 ts_tablet_manager.cc:1397] T 685c4476fa424859bcc6ce80b85d1b3c P 19358d5bb43a4a9bbc4d63f54fbe421d: Time spent bootstrapping tablet: real 0.033s user 0.009s sys 0.015s
I20250624 02:14:24.810194 1422 tablet_bootstrap.cc:492] T 685c4476fa424859bcc6ce80b85d1b3c P 9152818822e846d787eabcf497a88578: No bootstrap required, opened a new log
I20250624 02:14:24.810835 1422 ts_tablet_manager.cc:1397] T 685c4476fa424859bcc6ce80b85d1b3c P 9152818822e846d787eabcf497a88578: Time spent bootstrapping tablet: real 0.032s user 0.010s sys 0.019s
I20250624 02:14:24.819751 1420 raft_consensus.cc:357] T 685c4476fa424859bcc6ce80b85d1b3c P 19358d5bb43a4a9bbc4d63f54fbe421d [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "19358d5bb43a4a9bbc4d63f54fbe421d" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 41161 } } peers { permanent_uuid: "f413e64320ad4155934538662f239e5e" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40735 } } peers { permanent_uuid: "9152818822e846d787eabcf497a88578" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 33637 } }
I20250624 02:14:24.820518 1420 raft_consensus.cc:383] T 685c4476fa424859bcc6ce80b85d1b3c P 19358d5bb43a4a9bbc4d63f54fbe421d [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:14:24.820043 1421 raft_consensus.cc:357] T 685c4476fa424859bcc6ce80b85d1b3c P f413e64320ad4155934538662f239e5e [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "19358d5bb43a4a9bbc4d63f54fbe421d" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 41161 } } peers { permanent_uuid: "f413e64320ad4155934538662f239e5e" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40735 } } peers { permanent_uuid: "9152818822e846d787eabcf497a88578" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 33637 } }
I20250624 02:14:24.820788 1420 raft_consensus.cc:738] T 685c4476fa424859bcc6ce80b85d1b3c P 19358d5bb43a4a9bbc4d63f54fbe421d [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 19358d5bb43a4a9bbc4d63f54fbe421d, State: Initialized, Role: FOLLOWER
I20250624 02:14:24.820946 1421 raft_consensus.cc:383] T 685c4476fa424859bcc6ce80b85d1b3c P f413e64320ad4155934538662f239e5e [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:14:24.821241 1421 raft_consensus.cc:738] T 685c4476fa424859bcc6ce80b85d1b3c P f413e64320ad4155934538662f239e5e [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: f413e64320ad4155934538662f239e5e, State: Initialized, Role: FOLLOWER
I20250624 02:14:24.821651 1420 consensus_queue.cc:260] T 685c4476fa424859bcc6ce80b85d1b3c P 19358d5bb43a4a9bbc4d63f54fbe421d [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "19358d5bb43a4a9bbc4d63f54fbe421d" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 41161 } } peers { permanent_uuid: "f413e64320ad4155934538662f239e5e" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40735 } } peers { permanent_uuid: "9152818822e846d787eabcf497a88578" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 33637 } }
I20250624 02:14:24.822176 1421 consensus_queue.cc:260] T 685c4476fa424859bcc6ce80b85d1b3c P f413e64320ad4155934538662f239e5e [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "19358d5bb43a4a9bbc4d63f54fbe421d" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 41161 } } peers { permanent_uuid: "f413e64320ad4155934538662f239e5e" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40735 } } peers { permanent_uuid: "9152818822e846d787eabcf497a88578" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 33637 } }
I20250624 02:14:24.824946 1401 heartbeater.cc:499] Master 127.31.42.254:46783 was elected leader, sending a full tablet report...
I20250624 02:14:24.826375 1420 ts_tablet_manager.cc:1428] T 685c4476fa424859bcc6ce80b85d1b3c P 19358d5bb43a4a9bbc4d63f54fbe421d: Time spent starting tablet: real 0.030s user 0.023s sys 0.006s
I20250624 02:14:24.831445 1421 ts_tablet_manager.cc:1428] T 685c4476fa424859bcc6ce80b85d1b3c P f413e64320ad4155934538662f239e5e: Time spent starting tablet: real 0.042s user 0.037s sys 0.000s
W20250624 02:14:24.834674 1402 tablet.cc:2378] T 685c4476fa424859bcc6ce80b85d1b3c P 19358d5bb43a4a9bbc4d63f54fbe421d: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20250624 02:14:24.839357 1422 raft_consensus.cc:357] T 685c4476fa424859bcc6ce80b85d1b3c P 9152818822e846d787eabcf497a88578 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "19358d5bb43a4a9bbc4d63f54fbe421d" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 41161 } } peers { permanent_uuid: "f413e64320ad4155934538662f239e5e" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40735 } } peers { permanent_uuid: "9152818822e846d787eabcf497a88578" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 33637 } }
I20250624 02:14:24.840356 1422 raft_consensus.cc:383] T 685c4476fa424859bcc6ce80b85d1b3c P 9152818822e846d787eabcf497a88578 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:14:24.840785 1422 raft_consensus.cc:738] T 685c4476fa424859bcc6ce80b85d1b3c P 9152818822e846d787eabcf497a88578 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 9152818822e846d787eabcf497a88578, State: Initialized, Role: FOLLOWER
I20250624 02:14:24.841683 1422 consensus_queue.cc:260] T 685c4476fa424859bcc6ce80b85d1b3c P 9152818822e846d787eabcf497a88578 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "19358d5bb43a4a9bbc4d63f54fbe421d" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 41161 } } peers { permanent_uuid: "f413e64320ad4155934538662f239e5e" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40735 } } peers { permanent_uuid: "9152818822e846d787eabcf497a88578" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 33637 } }
I20250624 02:14:24.845309 1422 ts_tablet_manager.cc:1428] T 685c4476fa424859bcc6ce80b85d1b3c P 9152818822e846d787eabcf497a88578: Time spent starting tablet: real 0.034s user 0.027s sys 0.005s
I20250624 02:14:24.884552 1428 raft_consensus.cc:491] T 685c4476fa424859bcc6ce80b85d1b3c P 9152818822e846d787eabcf497a88578 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250624 02:14:24.885106 1428 raft_consensus.cc:513] T 685c4476fa424859bcc6ce80b85d1b3c P 9152818822e846d787eabcf497a88578 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "19358d5bb43a4a9bbc4d63f54fbe421d" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 41161 } } peers { permanent_uuid: "f413e64320ad4155934538662f239e5e" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40735 } } peers { permanent_uuid: "9152818822e846d787eabcf497a88578" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 33637 } }
I20250624 02:14:24.890098 1428 leader_election.cc:290] T 685c4476fa424859bcc6ce80b85d1b3c P 9152818822e846d787eabcf497a88578 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 19358d5bb43a4a9bbc4d63f54fbe421d (127.31.42.195:41161), f413e64320ad4155934538662f239e5e (127.31.42.194:40735)
I20250624 02:14:24.901788 1356 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "685c4476fa424859bcc6ce80b85d1b3c" candidate_uuid: "9152818822e846d787eabcf497a88578" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "19358d5bb43a4a9bbc4d63f54fbe421d" is_pre_election: true
I20250624 02:14:24.901806 1223 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "685c4476fa424859bcc6ce80b85d1b3c" candidate_uuid: "9152818822e846d787eabcf497a88578" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f413e64320ad4155934538662f239e5e" is_pre_election: true
I20250624 02:14:24.902626 1356 raft_consensus.cc:2466] T 685c4476fa424859bcc6ce80b85d1b3c P 19358d5bb43a4a9bbc4d63f54fbe421d [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 9152818822e846d787eabcf497a88578 in term 0.
I20250624 02:14:24.902679 1223 raft_consensus.cc:2466] T 685c4476fa424859bcc6ce80b85d1b3c P f413e64320ad4155934538662f239e5e [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 9152818822e846d787eabcf497a88578 in term 0.
I20250624 02:14:24.903925 1025 leader_election.cc:304] T 685c4476fa424859bcc6ce80b85d1b3c P 9152818822e846d787eabcf497a88578 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 19358d5bb43a4a9bbc4d63f54fbe421d, 9152818822e846d787eabcf497a88578; no voters:
I20250624 02:14:24.904793 1428 raft_consensus.cc:2802] T 685c4476fa424859bcc6ce80b85d1b3c P 9152818822e846d787eabcf497a88578 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20250624 02:14:24.905145 1428 raft_consensus.cc:491] T 685c4476fa424859bcc6ce80b85d1b3c P 9152818822e846d787eabcf497a88578 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20250624 02:14:24.905440 1428 raft_consensus.cc:3058] T 685c4476fa424859bcc6ce80b85d1b3c P 9152818822e846d787eabcf497a88578 [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:14:24.911971 1428 raft_consensus.cc:513] T 685c4476fa424859bcc6ce80b85d1b3c P 9152818822e846d787eabcf497a88578 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "19358d5bb43a4a9bbc4d63f54fbe421d" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 41161 } } peers { permanent_uuid: "f413e64320ad4155934538662f239e5e" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40735 } } peers { permanent_uuid: "9152818822e846d787eabcf497a88578" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 33637 } }
I20250624 02:14:24.913823 1428 leader_election.cc:290] T 685c4476fa424859bcc6ce80b85d1b3c P 9152818822e846d787eabcf497a88578 [CANDIDATE]: Term 1 election: Requested vote from peers 19358d5bb43a4a9bbc4d63f54fbe421d (127.31.42.195:41161), f413e64320ad4155934538662f239e5e (127.31.42.194:40735)
I20250624 02:14:24.914767 1356 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "685c4476fa424859bcc6ce80b85d1b3c" candidate_uuid: "9152818822e846d787eabcf497a88578" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "19358d5bb43a4a9bbc4d63f54fbe421d"
I20250624 02:14:24.914920 1223 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "685c4476fa424859bcc6ce80b85d1b3c" candidate_uuid: "9152818822e846d787eabcf497a88578" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f413e64320ad4155934538662f239e5e"
I20250624 02:14:24.915328 1356 raft_consensus.cc:3058] T 685c4476fa424859bcc6ce80b85d1b3c P 19358d5bb43a4a9bbc4d63f54fbe421d [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:14:24.915411 1223 raft_consensus.cc:3058] T 685c4476fa424859bcc6ce80b85d1b3c P f413e64320ad4155934538662f239e5e [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:14:24.922479 1356 raft_consensus.cc:2466] T 685c4476fa424859bcc6ce80b85d1b3c P 19358d5bb43a4a9bbc4d63f54fbe421d [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 9152818822e846d787eabcf497a88578 in term 1.
I20250624 02:14:24.922479 1223 raft_consensus.cc:2466] T 685c4476fa424859bcc6ce80b85d1b3c P f413e64320ad4155934538662f239e5e [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 9152818822e846d787eabcf497a88578 in term 1.
I20250624 02:14:24.923715 1024 leader_election.cc:304] T 685c4476fa424859bcc6ce80b85d1b3c P 9152818822e846d787eabcf497a88578 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 3 yes votes; 0 no votes. yes voters: 19358d5bb43a4a9bbc4d63f54fbe421d, 9152818822e846d787eabcf497a88578, f413e64320ad4155934538662f239e5e; no voters:
I20250624 02:14:24.924433 1428 raft_consensus.cc:2802] T 685c4476fa424859bcc6ce80b85d1b3c P 9152818822e846d787eabcf497a88578 [term 1 FOLLOWER]: Leader election won for term 1
I20250624 02:14:24.926002 1428 raft_consensus.cc:695] T 685c4476fa424859bcc6ce80b85d1b3c P 9152818822e846d787eabcf497a88578 [term 1 LEADER]: Becoming Leader. State: Replica: 9152818822e846d787eabcf497a88578, State: Running, Role: LEADER
I20250624 02:14:24.926864 1428 consensus_queue.cc:237] T 685c4476fa424859bcc6ce80b85d1b3c P 9152818822e846d787eabcf497a88578 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "19358d5bb43a4a9bbc4d63f54fbe421d" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 41161 } } peers { permanent_uuid: "f413e64320ad4155934538662f239e5e" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40735 } } peers { permanent_uuid: "9152818822e846d787eabcf497a88578" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 33637 } }
I20250624 02:14:24.937446 945 catalog_manager.cc:5582] T 685c4476fa424859bcc6ce80b85d1b3c P 9152818822e846d787eabcf497a88578 reported cstate change: term changed from 0 to 1, leader changed from <none> to 9152818822e846d787eabcf497a88578 (127.31.42.193). New cstate: current_term: 1 leader_uuid: "9152818822e846d787eabcf497a88578" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "19358d5bb43a4a9bbc4d63f54fbe421d" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 41161 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "f413e64320ad4155934538662f239e5e" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40735 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "9152818822e846d787eabcf497a88578" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 33637 } health_report { overall_health: HEALTHY } } }
W20250624 02:14:24.950977 1135 tablet.cc:2378] T 685c4476fa424859bcc6ce80b85d1b3c P 9152818822e846d787eabcf497a88578: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20250624 02:14:24.979034 31915 external_mini_cluster.cc:934] 3 TS(s) registered with all masters
I20250624 02:14:24.982574 31915 ts_itest-base.cc:246] Waiting for 1 tablets on tserver 9152818822e846d787eabcf497a88578 to finish bootstrapping
I20250624 02:14:24.994998 31915 ts_itest-base.cc:246] Waiting for 1 tablets on tserver f413e64320ad4155934538662f239e5e to finish bootstrapping
I20250624 02:14:25.007124 31915 ts_itest-base.cc:246] Waiting for 1 tablets on tserver 19358d5bb43a4a9bbc4d63f54fbe421d to finish bootstrapping
W20250624 02:14:25.008066 1269 tablet.cc:2378] T 685c4476fa424859bcc6ce80b85d1b3c P f413e64320ad4155934538662f239e5e: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20250624 02:14:25.025331 945 catalog_manager.cc:2232] Servicing CreateTable request from {username='slave'} at 127.0.0.1:35986:
name: "TestAnotherTable"
schema {
columns {
name: "foo"
type: INT32
is_key: true
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "bar"
type: INT32
is_key: false
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
comment: "comment for bar"
immutable: false
}
}
split_rows_range_bounds {
}
partition_schema {
range_schema {
columns {
name: "foo"
}
}
}
W20250624 02:14:25.027423 945 catalog_manager.cc:6944] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table TestAnotherTable in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20250624 02:14:25.048959 1069 tablet_service.cc:1468] Processing CreateTablet for tablet ade3038de36a4532966b8a414fb15824 (DEFAULT_TABLE table=TestAnotherTable [id=3edc2eb5d76f4ba89e18f0bb1763373c]), partition=RANGE (foo) PARTITION UNBOUNDED
I20250624 02:14:25.049477 1203 tablet_service.cc:1468] Processing CreateTablet for tablet ade3038de36a4532966b8a414fb15824 (DEFAULT_TABLE table=TestAnotherTable [id=3edc2eb5d76f4ba89e18f0bb1763373c]), partition=RANGE (foo) PARTITION UNBOUNDED
I20250624 02:14:25.050166 1069 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet ade3038de36a4532966b8a414fb15824. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:14:25.050741 1203 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet ade3038de36a4532966b8a414fb15824. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:14:25.060425 1336 tablet_service.cc:1468] Processing CreateTablet for tablet ade3038de36a4532966b8a414fb15824 (DEFAULT_TABLE table=TestAnotherTable [id=3edc2eb5d76f4ba89e18f0bb1763373c]), partition=RANGE (foo) PARTITION UNBOUNDED
I20250624 02:14:25.061615 1336 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet ade3038de36a4532966b8a414fb15824. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:14:25.065407 1422 tablet_bootstrap.cc:492] T ade3038de36a4532966b8a414fb15824 P 9152818822e846d787eabcf497a88578: Bootstrap starting.
I20250624 02:14:25.067503 1421 tablet_bootstrap.cc:492] T ade3038de36a4532966b8a414fb15824 P f413e64320ad4155934538662f239e5e: Bootstrap starting.
I20250624 02:14:25.072479 1421 tablet_bootstrap.cc:654] T ade3038de36a4532966b8a414fb15824 P f413e64320ad4155934538662f239e5e: Neither blocks nor log segments found. Creating new log.
I20250624 02:14:25.072572 1422 tablet_bootstrap.cc:654] T ade3038de36a4532966b8a414fb15824 P 9152818822e846d787eabcf497a88578: Neither blocks nor log segments found. Creating new log.
I20250624 02:14:25.074561 1420 tablet_bootstrap.cc:492] T ade3038de36a4532966b8a414fb15824 P 19358d5bb43a4a9bbc4d63f54fbe421d: Bootstrap starting.
I20250624 02:14:25.079185 1420 tablet_bootstrap.cc:654] T ade3038de36a4532966b8a414fb15824 P 19358d5bb43a4a9bbc4d63f54fbe421d: Neither blocks nor log segments found. Creating new log.
I20250624 02:14:25.080303 1422 tablet_bootstrap.cc:492] T ade3038de36a4532966b8a414fb15824 P 9152818822e846d787eabcf497a88578: No bootstrap required, opened a new log
I20250624 02:14:25.080736 1422 ts_tablet_manager.cc:1397] T ade3038de36a4532966b8a414fb15824 P 9152818822e846d787eabcf497a88578: Time spent bootstrapping tablet: real 0.016s user 0.006s sys 0.006s
I20250624 02:14:25.083030 1422 raft_consensus.cc:357] T ade3038de36a4532966b8a414fb15824 P 9152818822e846d787eabcf497a88578 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "9152818822e846d787eabcf497a88578" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 33637 } } peers { permanent_uuid: "f413e64320ad4155934538662f239e5e" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40735 } } peers { permanent_uuid: "19358d5bb43a4a9bbc4d63f54fbe421d" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 41161 } }
I20250624 02:14:25.083530 1422 raft_consensus.cc:383] T ade3038de36a4532966b8a414fb15824 P 9152818822e846d787eabcf497a88578 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:14:25.083801 1422 raft_consensus.cc:738] T ade3038de36a4532966b8a414fb15824 P 9152818822e846d787eabcf497a88578 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 9152818822e846d787eabcf497a88578, State: Initialized, Role: FOLLOWER
I20250624 02:14:25.085896 1422 consensus_queue.cc:260] T ade3038de36a4532966b8a414fb15824 P 9152818822e846d787eabcf497a88578 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "9152818822e846d787eabcf497a88578" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 33637 } } peers { permanent_uuid: "f413e64320ad4155934538662f239e5e" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40735 } } peers { permanent_uuid: "19358d5bb43a4a9bbc4d63f54fbe421d" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 41161 } }
I20250624 02:14:25.087769 1422 ts_tablet_manager.cc:1428] T ade3038de36a4532966b8a414fb15824 P 9152818822e846d787eabcf497a88578: Time spent starting tablet: real 0.007s user 0.004s sys 0.000s
I20250624 02:14:25.091413 1421 tablet_bootstrap.cc:492] T ade3038de36a4532966b8a414fb15824 P f413e64320ad4155934538662f239e5e: No bootstrap required, opened a new log
I20250624 02:14:25.091919 1421 ts_tablet_manager.cc:1397] T ade3038de36a4532966b8a414fb15824 P f413e64320ad4155934538662f239e5e: Time spent bootstrapping tablet: real 0.025s user 0.010s sys 0.008s
I20250624 02:14:25.094359 1420 tablet_bootstrap.cc:492] T ade3038de36a4532966b8a414fb15824 P 19358d5bb43a4a9bbc4d63f54fbe421d: No bootstrap required, opened a new log
I20250624 02:14:25.094805 1420 ts_tablet_manager.cc:1397] T ade3038de36a4532966b8a414fb15824 P 19358d5bb43a4a9bbc4d63f54fbe421d: Time spent bootstrapping tablet: real 0.020s user 0.001s sys 0.016s
I20250624 02:14:25.094682 1421 raft_consensus.cc:357] T ade3038de36a4532966b8a414fb15824 P f413e64320ad4155934538662f239e5e [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "9152818822e846d787eabcf497a88578" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 33637 } } peers { permanent_uuid: "f413e64320ad4155934538662f239e5e" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40735 } } peers { permanent_uuid: "19358d5bb43a4a9bbc4d63f54fbe421d" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 41161 } }
I20250624 02:14:25.095400 1421 raft_consensus.cc:383] T ade3038de36a4532966b8a414fb15824 P f413e64320ad4155934538662f239e5e [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:14:25.095710 1421 raft_consensus.cc:738] T ade3038de36a4532966b8a414fb15824 P f413e64320ad4155934538662f239e5e [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: f413e64320ad4155934538662f239e5e, State: Initialized, Role: FOLLOWER
I20250624 02:14:25.096403 1421 consensus_queue.cc:260] T ade3038de36a4532966b8a414fb15824 P f413e64320ad4155934538662f239e5e [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "9152818822e846d787eabcf497a88578" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 33637 } } peers { permanent_uuid: "f413e64320ad4155934538662f239e5e" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40735 } } peers { permanent_uuid: "19358d5bb43a4a9bbc4d63f54fbe421d" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 41161 } }
I20250624 02:14:25.097576 1420 raft_consensus.cc:357] T ade3038de36a4532966b8a414fb15824 P 19358d5bb43a4a9bbc4d63f54fbe421d [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "9152818822e846d787eabcf497a88578" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 33637 } } peers { permanent_uuid: "f413e64320ad4155934538662f239e5e" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40735 } } peers { permanent_uuid: "19358d5bb43a4a9bbc4d63f54fbe421d" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 41161 } }
I20250624 02:14:25.098412 1420 raft_consensus.cc:383] T ade3038de36a4532966b8a414fb15824 P 19358d5bb43a4a9bbc4d63f54fbe421d [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:14:25.098676 1421 ts_tablet_manager.cc:1428] T ade3038de36a4532966b8a414fb15824 P f413e64320ad4155934538662f239e5e: Time spent starting tablet: real 0.006s user 0.006s sys 0.000s
I20250624 02:14:25.098727 1420 raft_consensus.cc:738] T ade3038de36a4532966b8a414fb15824 P 19358d5bb43a4a9bbc4d63f54fbe421d [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 19358d5bb43a4a9bbc4d63f54fbe421d, State: Initialized, Role: FOLLOWER
I20250624 02:14:25.099485 1420 consensus_queue.cc:260] T ade3038de36a4532966b8a414fb15824 P 19358d5bb43a4a9bbc4d63f54fbe421d [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "9152818822e846d787eabcf497a88578" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 33637 } } peers { permanent_uuid: "f413e64320ad4155934538662f239e5e" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40735 } } peers { permanent_uuid: "19358d5bb43a4a9bbc4d63f54fbe421d" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 41161 } }
I20250624 02:14:25.101918 1420 ts_tablet_manager.cc:1428] T ade3038de36a4532966b8a414fb15824 P 19358d5bb43a4a9bbc4d63f54fbe421d: Time spent starting tablet: real 0.007s user 0.005s sys 0.000s
I20250624 02:14:25.249250 1426 raft_consensus.cc:491] T ade3038de36a4532966b8a414fb15824 P 19358d5bb43a4a9bbc4d63f54fbe421d [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250624 02:14:25.249742 1426 raft_consensus.cc:513] T ade3038de36a4532966b8a414fb15824 P 19358d5bb43a4a9bbc4d63f54fbe421d [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "9152818822e846d787eabcf497a88578" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 33637 } } peers { permanent_uuid: "f413e64320ad4155934538662f239e5e" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40735 } } peers { permanent_uuid: "19358d5bb43a4a9bbc4d63f54fbe421d" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 41161 } }
I20250624 02:14:25.251998 1426 leader_election.cc:290] T ade3038de36a4532966b8a414fb15824 P 19358d5bb43a4a9bbc4d63f54fbe421d [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 9152818822e846d787eabcf497a88578 (127.31.42.193:33637), f413e64320ad4155934538662f239e5e (127.31.42.194:40735)
I20250624 02:14:25.262698 1089 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "ade3038de36a4532966b8a414fb15824" candidate_uuid: "19358d5bb43a4a9bbc4d63f54fbe421d" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "9152818822e846d787eabcf497a88578" is_pre_election: true
I20250624 02:14:25.263581 1089 raft_consensus.cc:2466] T ade3038de36a4532966b8a414fb15824 P 9152818822e846d787eabcf497a88578 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 19358d5bb43a4a9bbc4d63f54fbe421d in term 0.
I20250624 02:14:25.264904 1291 leader_election.cc:304] T ade3038de36a4532966b8a414fb15824 P 19358d5bb43a4a9bbc4d63f54fbe421d [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 19358d5bb43a4a9bbc4d63f54fbe421d, 9152818822e846d787eabcf497a88578; no voters:
I20250624 02:14:25.265167 1223 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "ade3038de36a4532966b8a414fb15824" candidate_uuid: "19358d5bb43a4a9bbc4d63f54fbe421d" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f413e64320ad4155934538662f239e5e" is_pre_election: true
I20250624 02:14:25.265611 1426 raft_consensus.cc:2802] T ade3038de36a4532966b8a414fb15824 P 19358d5bb43a4a9bbc4d63f54fbe421d [term 0 FOLLOWER]: Leader pre-election won for term 1
I20250624 02:14:25.265763 1223 raft_consensus.cc:2466] T ade3038de36a4532966b8a414fb15824 P f413e64320ad4155934538662f239e5e [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 19358d5bb43a4a9bbc4d63f54fbe421d in term 0.
I20250624 02:14:25.265990 1426 raft_consensus.cc:491] T ade3038de36a4532966b8a414fb15824 P 19358d5bb43a4a9bbc4d63f54fbe421d [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20250624 02:14:25.266319 1426 raft_consensus.cc:3058] T ade3038de36a4532966b8a414fb15824 P 19358d5bb43a4a9bbc4d63f54fbe421d [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:14:25.270870 1426 raft_consensus.cc:513] T ade3038de36a4532966b8a414fb15824 P 19358d5bb43a4a9bbc4d63f54fbe421d [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "9152818822e846d787eabcf497a88578" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 33637 } } peers { permanent_uuid: "f413e64320ad4155934538662f239e5e" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40735 } } peers { permanent_uuid: "19358d5bb43a4a9bbc4d63f54fbe421d" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 41161 } }
I20250624 02:14:25.272262 1426 leader_election.cc:290] T ade3038de36a4532966b8a414fb15824 P 19358d5bb43a4a9bbc4d63f54fbe421d [CANDIDATE]: Term 1 election: Requested vote from peers 9152818822e846d787eabcf497a88578 (127.31.42.193:33637), f413e64320ad4155934538662f239e5e (127.31.42.194:40735)
I20250624 02:14:25.273191 1223 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "ade3038de36a4532966b8a414fb15824" candidate_uuid: "19358d5bb43a4a9bbc4d63f54fbe421d" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "f413e64320ad4155934538662f239e5e"
I20250624 02:14:25.273187 1089 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "ade3038de36a4532966b8a414fb15824" candidate_uuid: "19358d5bb43a4a9bbc4d63f54fbe421d" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "9152818822e846d787eabcf497a88578"
I20250624 02:14:25.273595 1223 raft_consensus.cc:3058] T ade3038de36a4532966b8a414fb15824 P f413e64320ad4155934538662f239e5e [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:14:25.273612 1089 raft_consensus.cc:3058] T ade3038de36a4532966b8a414fb15824 P 9152818822e846d787eabcf497a88578 [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:14:25.278282 1223 raft_consensus.cc:2466] T ade3038de36a4532966b8a414fb15824 P f413e64320ad4155934538662f239e5e [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 19358d5bb43a4a9bbc4d63f54fbe421d in term 1.
I20250624 02:14:25.278681 1089 raft_consensus.cc:2466] T ade3038de36a4532966b8a414fb15824 P 9152818822e846d787eabcf497a88578 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 19358d5bb43a4a9bbc4d63f54fbe421d in term 1.
I20250624 02:14:25.279114 1291 leader_election.cc:304] T ade3038de36a4532966b8a414fb15824 P 19358d5bb43a4a9bbc4d63f54fbe421d [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 19358d5bb43a4a9bbc4d63f54fbe421d, f413e64320ad4155934538662f239e5e; no voters:
I20250624 02:14:25.279695 1426 raft_consensus.cc:2802] T ade3038de36a4532966b8a414fb15824 P 19358d5bb43a4a9bbc4d63f54fbe421d [term 1 FOLLOWER]: Leader election won for term 1
I20250624 02:14:25.281335 1426 raft_consensus.cc:695] T ade3038de36a4532966b8a414fb15824 P 19358d5bb43a4a9bbc4d63f54fbe421d [term 1 LEADER]: Becoming Leader. State: Replica: 19358d5bb43a4a9bbc4d63f54fbe421d, State: Running, Role: LEADER
I20250624 02:14:25.282261 1426 consensus_queue.cc:237] T ade3038de36a4532966b8a414fb15824 P 19358d5bb43a4a9bbc4d63f54fbe421d [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "9152818822e846d787eabcf497a88578" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 33637 } } peers { permanent_uuid: "f413e64320ad4155934538662f239e5e" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40735 } } peers { permanent_uuid: "19358d5bb43a4a9bbc4d63f54fbe421d" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 41161 } }
I20250624 02:14:25.292959 946 catalog_manager.cc:5582] T ade3038de36a4532966b8a414fb15824 P 19358d5bb43a4a9bbc4d63f54fbe421d reported cstate change: term changed from 0 to 1, leader changed from <none> to 19358d5bb43a4a9bbc4d63f54fbe421d (127.31.42.195). New cstate: current_term: 1 leader_uuid: "19358d5bb43a4a9bbc4d63f54fbe421d" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "9152818822e846d787eabcf497a88578" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 33637 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "f413e64320ad4155934538662f239e5e" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40735 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "19358d5bb43a4a9bbc4d63f54fbe421d" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 41161 } health_report { overall_health: HEALTHY } } }
I20250624 02:14:25.374020 1428 consensus_queue.cc:1035] T 685c4476fa424859bcc6ce80b85d1b3c P 9152818822e846d787eabcf497a88578 [LEADER]: Connected to new peer: Peer: permanent_uuid: "f413e64320ad4155934538662f239e5e" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40735 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20250624 02:14:25.390501 1431 consensus_queue.cc:1035] T 685c4476fa424859bcc6ce80b85d1b3c P 9152818822e846d787eabcf497a88578 [LEADER]: Connected to new peer: Peer: permanent_uuid: "19358d5bb43a4a9bbc4d63f54fbe421d" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 41161 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.001s
W20250624 02:14:25.633459 1441 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:14:25.634110 1441 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:14:25.665889 1441 flags.cc:425] Enabled experimental flag: --enable_multi_tenancy=false
I20250624 02:14:25.693125 1426 consensus_queue.cc:1035] T ade3038de36a4532966b8a414fb15824 P 19358d5bb43a4a9bbc4d63f54fbe421d [LEADER]: Connected to new peer: Peer: permanent_uuid: "f413e64320ad4155934538662f239e5e" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40735 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20250624 02:14:25.705543 1453 consensus_queue.cc:1035] T ade3038de36a4532966b8a414fb15824 P 19358d5bb43a4a9bbc4d63f54fbe421d [LEADER]: Connected to new peer: Peer: permanent_uuid: "9152818822e846d787eabcf497a88578" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 33637 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.002s
W20250624 02:14:27.089017 1441 thread.cc:640] rpc reactor (reactor) Time spent creating pthread: real 1.374s user 0.567s sys 0.805s
W20250624 02:14:27.089427 1441 thread.cc:606] rpc reactor (reactor) Time spent starting thread: real 1.375s user 0.567s sys 0.805s
W20250624 02:14:28.475123 1468 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:14:28.475709 1468 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:14:28.507092 1468 flags.cc:425] Enabled experimental flag: --enable_multi_tenancy=false
W20250624 02:14:29.830533 1468 thread.cc:640] rpc reactor (reactor) Time spent creating pthread: real 1.281s user 0.528s sys 0.752s
W20250624 02:14:29.830839 1468 thread.cc:606] rpc reactor (reactor) Time spent starting thread: real 1.281s user 0.528s sys 0.752s
W20250624 02:14:31.210321 1484 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:14:31.210908 1484 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:14:31.245707 1484 flags.cc:425] Enabled experimental flag: --enable_multi_tenancy=false
W20250624 02:14:32.539845 1484 thread.cc:640] rpc reactor (reactor) Time spent creating pthread: real 1.249s user 0.483s sys 0.765s
W20250624 02:14:32.540266 1484 thread.cc:606] rpc reactor (reactor) Time spent starting thread: real 1.250s user 0.483s sys 0.765s
W20250624 02:14:33.934367 1500 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:14:33.934962 1500 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:14:33.967907 1500 flags.cc:425] Enabled experimental flag: --enable_multi_tenancy=false
W20250624 02:14:35.295464 1500 thread.cc:640] rpc reactor (reactor) Time spent creating pthread: real 1.284s user 0.544s sys 0.736s
W20250624 02:14:35.295771 1500 thread.cc:606] rpc reactor (reactor) Time spent starting thread: real 1.284s user 0.544s sys 0.736s
I20250624 02:14:36.379567 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 1005
I20250624 02:14:36.407382 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 1138
I20250624 02:14:36.435096 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 1272
I20250624 02:14:36.464184 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 913
2025-06-24T02:14:36Z chronyd exiting
[ OK ] AdminCliTest.TestDescribeTableColumnFlags (19478 ms)
[ RUN ] AdminCliTest.TestAuthzResetCacheNotAuthorized
I20250624 02:14:36.526254 31915 test_util.cc:276] Using random seed: -472538481
I20250624 02:14:36.530519 31915 ts_itest-base.cc:115] Starting cluster with:
I20250624 02:14:36.530709 31915 ts_itest-base.cc:116] --------------
I20250624 02:14:36.530879 31915 ts_itest-base.cc:117] 3 tablet servers
I20250624 02:14:36.531024 31915 ts_itest-base.cc:118] 3 replicas per TS
I20250624 02:14:36.531159 31915 ts_itest-base.cc:119] --------------
2025-06-24T02:14:36Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2025-06-24T02:14:36Z Disabled control of system clock
I20250624 02:14:36.570174 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.31.42.254:42041
--webserver_interface=127.31.42.254
--webserver_port=0
--builtin_ntp_servers=127.31.42.212:37217
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.31.42.254:42041
--superuser_acl=no-such-user with env {}
W20250624 02:14:36.887835 1521 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:14:36.888410 1521 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:14:36.888870 1521 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:14:36.921090 1521 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20250624 02:14:36.921401 1521 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:14:36.921618 1521 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20250624 02:14:36.921814 1521 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20250624 02:14:36.958307 1521 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:37217
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.31.42.254:42041
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.31.42.254:42041
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--superuser_acl=<redacted>
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb
--webserver_interface=127.31.42.254
--webserver_port=0
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/logs
--logbuflevel=-1
--logtostderr=true
Master server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:14:36.959566 1521 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:14:36.961234 1521 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:14:36.976833 1527 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:14:36.976894 1530 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:14:36.978677 1528 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:14:36.979351 1521 server_base.cc:1048] running on GCE node
I20250624 02:14:38.162778 1521 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:14:38.165421 1521 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:14:38.166764 1521 hybrid_clock.cc:648] HybridClock initialized: now 1750731278166727 us; error 59 us; skew 500 ppm
I20250624 02:14:38.167538 1521 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:14:38.178076 1521 webserver.cc:469] Webserver started at http://127.31.42.254:43221/ using document root <none> and password file <none>
I20250624 02:14:38.179036 1521 fs_manager.cc:362] Metadata directory not provided
I20250624 02:14:38.179230 1521 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:14:38.179652 1521 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:14:38.184108 1521 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/instance:
uuid: "b6e05ee5b2fc4165bd18e42864449346"
format_stamp: "Formatted at 2025-06-24 02:14:38 on dist-test-slave-5k9r"
I20250624 02:14:38.185163 1521 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal/instance:
uuid: "b6e05ee5b2fc4165bd18e42864449346"
format_stamp: "Formatted at 2025-06-24 02:14:38 on dist-test-slave-5k9r"
I20250624 02:14:38.192350 1521 fs_manager.cc:696] Time spent creating directory manager: real 0.007s user 0.009s sys 0.000s
I20250624 02:14:38.197922 1537 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:38.198993 1521 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.001s sys 0.002s
I20250624 02:14:38.199307 1521 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
uuid: "b6e05ee5b2fc4165bd18e42864449346"
format_stamp: "Formatted at 2025-06-24 02:14:38 on dist-test-slave-5k9r"
I20250624 02:14:38.199630 1521 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:14:38.247651 1521 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:14:38.249110 1521 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:14:38.249547 1521 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:14:38.319147 1521 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.254:42041
I20250624 02:14:38.319226 1588 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.254:42041 every 8 connection(s)
I20250624 02:14:38.321789 1521 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb
I20250624 02:14:38.324101 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 1521
I20250624 02:14:38.324656 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal/instance
I20250624 02:14:38.327642 1589 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:14:38.353156 1589 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P b6e05ee5b2fc4165bd18e42864449346: Bootstrap starting.
I20250624 02:14:38.358721 1589 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P b6e05ee5b2fc4165bd18e42864449346: Neither blocks nor log segments found. Creating new log.
I20250624 02:14:38.360556 1589 log.cc:826] T 00000000000000000000000000000000 P b6e05ee5b2fc4165bd18e42864449346: Log is configured to *not* fsync() on all Append() calls
I20250624 02:14:38.364730 1589 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P b6e05ee5b2fc4165bd18e42864449346: No bootstrap required, opened a new log
I20250624 02:14:38.382143 1589 raft_consensus.cc:357] T 00000000000000000000000000000000 P b6e05ee5b2fc4165bd18e42864449346 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "b6e05ee5b2fc4165bd18e42864449346" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 42041 } }
I20250624 02:14:38.383062 1589 raft_consensus.cc:383] T 00000000000000000000000000000000 P b6e05ee5b2fc4165bd18e42864449346 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:14:38.383338 1589 raft_consensus.cc:738] T 00000000000000000000000000000000 P b6e05ee5b2fc4165bd18e42864449346 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: b6e05ee5b2fc4165bd18e42864449346, State: Initialized, Role: FOLLOWER
I20250624 02:14:38.383960 1589 consensus_queue.cc:260] T 00000000000000000000000000000000 P b6e05ee5b2fc4165bd18e42864449346 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "b6e05ee5b2fc4165bd18e42864449346" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 42041 } }
I20250624 02:14:38.384436 1589 raft_consensus.cc:397] T 00000000000000000000000000000000 P b6e05ee5b2fc4165bd18e42864449346 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 02:14:38.384660 1589 raft_consensus.cc:491] T 00000000000000000000000000000000 P b6e05ee5b2fc4165bd18e42864449346 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 02:14:38.384919 1589 raft_consensus.cc:3058] T 00000000000000000000000000000000 P b6e05ee5b2fc4165bd18e42864449346 [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:14:38.388813 1589 raft_consensus.cc:513] T 00000000000000000000000000000000 P b6e05ee5b2fc4165bd18e42864449346 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "b6e05ee5b2fc4165bd18e42864449346" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 42041 } }
I20250624 02:14:38.389463 1589 leader_election.cc:304] T 00000000000000000000000000000000 P b6e05ee5b2fc4165bd18e42864449346 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: b6e05ee5b2fc4165bd18e42864449346; no voters:
I20250624 02:14:38.391239 1589 leader_election.cc:290] T 00000000000000000000000000000000 P b6e05ee5b2fc4165bd18e42864449346 [CANDIDATE]: Term 1 election: Requested vote from peers
I20250624 02:14:38.391996 1594 raft_consensus.cc:2802] T 00000000000000000000000000000000 P b6e05ee5b2fc4165bd18e42864449346 [term 1 FOLLOWER]: Leader election won for term 1
I20250624 02:14:38.394085 1594 raft_consensus.cc:695] T 00000000000000000000000000000000 P b6e05ee5b2fc4165bd18e42864449346 [term 1 LEADER]: Becoming Leader. State: Replica: b6e05ee5b2fc4165bd18e42864449346, State: Running, Role: LEADER
I20250624 02:14:38.394980 1594 consensus_queue.cc:237] T 00000000000000000000000000000000 P b6e05ee5b2fc4165bd18e42864449346 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "b6e05ee5b2fc4165bd18e42864449346" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 42041 } }
I20250624 02:14:38.396862 1589 sys_catalog.cc:564] T 00000000000000000000000000000000 P b6e05ee5b2fc4165bd18e42864449346 [sys.catalog]: configured and running, proceeding with master startup.
I20250624 02:14:38.404453 1596 sys_catalog.cc:455] T 00000000000000000000000000000000 P b6e05ee5b2fc4165bd18e42864449346 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "b6e05ee5b2fc4165bd18e42864449346" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "b6e05ee5b2fc4165bd18e42864449346" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 42041 } } }
I20250624 02:14:38.405305 1596 sys_catalog.cc:458] T 00000000000000000000000000000000 P b6e05ee5b2fc4165bd18e42864449346 [sys.catalog]: This master's current role is: LEADER
I20250624 02:14:38.405335 1595 sys_catalog.cc:455] T 00000000000000000000000000000000 P b6e05ee5b2fc4165bd18e42864449346 [sys.catalog]: SysCatalogTable state changed. Reason: New leader b6e05ee5b2fc4165bd18e42864449346. Latest consensus state: current_term: 1 leader_uuid: "b6e05ee5b2fc4165bd18e42864449346" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "b6e05ee5b2fc4165bd18e42864449346" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 42041 } } }
I20250624 02:14:38.407384 1595 sys_catalog.cc:458] T 00000000000000000000000000000000 P b6e05ee5b2fc4165bd18e42864449346 [sys.catalog]: This master's current role is: LEADER
I20250624 02:14:38.410282 1602 catalog_manager.cc:1477] Loading table and tablet metadata into memory...
I20250624 02:14:38.425459 1602 catalog_manager.cc:1486] Initializing Kudu cluster ID...
I20250624 02:14:38.440099 1602 catalog_manager.cc:1349] Generated new cluster ID: d5dcb2aaffe445dbaa7bc899d0bbb5a1
I20250624 02:14:38.440415 1602 catalog_manager.cc:1497] Initializing Kudu internal certificate authority...
I20250624 02:14:38.455575 1602 catalog_manager.cc:1372] Generated new certificate authority record
I20250624 02:14:38.457007 1602 catalog_manager.cc:1506] Loading token signing keys...
I20250624 02:14:38.469206 1602 catalog_manager.cc:5955] T 00000000000000000000000000000000 P b6e05ee5b2fc4165bd18e42864449346: Generated new TSK 0
I20250624 02:14:38.470160 1602 catalog_manager.cc:1516] Initializing in-progress tserver states...
I20250624 02:14:38.484874 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.193:0
--local_ip_for_outbound_sockets=127.31.42.193
--webserver_interface=127.31.42.193
--webserver_port=0
--tserver_master_addrs=127.31.42.254:42041
--builtin_ntp_servers=127.31.42.212:37217
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--consensus_rpc_timeout_ms=30000 with env {}
W20250624 02:14:38.790347 1613 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:14:38.790876 1613 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:14:38.791375 1613 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:14:38.822811 1613 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:14:38.823714 1613 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.193
I20250624 02:14:38.858925 1613 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:37217
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.193:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/info.pb
--webserver_interface=127.31.42.193
--webserver_port=0
--tserver_master_addrs=127.31.42.254:42041
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.193
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:14:38.860201 1613 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:14:38.861790 1613 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:14:38.879500 1620 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:14:38.881138 1622 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:14:38.879550 1619 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:14:40.081436 1621 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Resolving timed out after 1199 milliseconds
I20250624 02:14:40.081563 1613 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
I20250624 02:14:40.082835 1613 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:14:40.085561 1613 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:14:40.087033 1613 hybrid_clock.cc:648] HybridClock initialized: now 1750731280086982 us; error 68 us; skew 500 ppm
I20250624 02:14:40.087818 1613 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:14:40.094677 1613 webserver.cc:469] Webserver started at http://127.31.42.193:45431/ using document root <none> and password file <none>
I20250624 02:14:40.095618 1613 fs_manager.cc:362] Metadata directory not provided
I20250624 02:14:40.095845 1613 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:14:40.096297 1613 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:14:40.100831 1613 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/instance:
uuid: "79674f959e2e4086884e5625971210e9"
format_stamp: "Formatted at 2025-06-24 02:14:40 on dist-test-slave-5k9r"
I20250624 02:14:40.101969 1613 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal/instance:
uuid: "79674f959e2e4086884e5625971210e9"
format_stamp: "Formatted at 2025-06-24 02:14:40 on dist-test-slave-5k9r"
I20250624 02:14:40.109444 1613 fs_manager.cc:696] Time spent creating directory manager: real 0.007s user 0.006s sys 0.001s
I20250624 02:14:40.116169 1629 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:40.117326 1613 fs_manager.cc:730] Time spent opening block manager: real 0.005s user 0.002s sys 0.004s
I20250624 02:14:40.117660 1613 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
uuid: "79674f959e2e4086884e5625971210e9"
format_stamp: "Formatted at 2025-06-24 02:14:40 on dist-test-slave-5k9r"
I20250624 02:14:40.118041 1613 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:14:40.179068 1613 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:14:40.180554 1613 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:14:40.180977 1613 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:14:40.184069 1613 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:14:40.188511 1613 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 02:14:40.188716 1613 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:40.188995 1613 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 02:14:40.189152 1613 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:40.358698 1613 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.193:36445
I20250624 02:14:40.358805 1741 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.193:36445 every 8 connection(s)
I20250624 02:14:40.361251 1613 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/info.pb
I20250624 02:14:40.366366 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 1613
I20250624 02:14:40.366767 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal/instance
I20250624 02:14:40.379030 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.194:0
--local_ip_for_outbound_sockets=127.31.42.194
--webserver_interface=127.31.42.194
--webserver_port=0
--tserver_master_addrs=127.31.42.254:42041
--builtin_ntp_servers=127.31.42.212:37217
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--consensus_rpc_timeout_ms=30000 with env {}
I20250624 02:14:40.389719 1742 heartbeater.cc:344] Connected to a master server at 127.31.42.254:42041
I20250624 02:14:40.390311 1742 heartbeater.cc:461] Registering TS with master...
I20250624 02:14:40.391628 1742 heartbeater.cc:507] Master 127.31.42.254:42041 requested a full tablet report, sending...
I20250624 02:14:40.394822 1554 ts_manager.cc:194] Registered new tserver with Master: 79674f959e2e4086884e5625971210e9 (127.31.42.193:36445)
I20250624 02:14:40.397823 1554 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.193:48143
W20250624 02:14:40.683980 1746 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:14:40.684477 1746 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:14:40.684976 1746 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:14:40.716516 1746 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:14:40.717453 1746 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.194
I20250624 02:14:40.753690 1746 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:37217
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.194:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/info.pb
--webserver_interface=127.31.42.194
--webserver_port=0
--tserver_master_addrs=127.31.42.254:42041
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.194
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:14:40.755066 1746 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:14:40.756994 1746 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:14:40.779454 1752 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:14:41.402848 1742 heartbeater.cc:499] Master 127.31.42.254:42041 was elected leader, sending a full tablet report...
W20250624 02:14:42.175303 1751 debug-util.cc:398] Leaking SignalData structure 0x7b08000184e0 after lost signal to thread 1746
W20250624 02:14:42.588250 1746 thread.cc:640] OpenStack (cloud detector) Time spent creating pthread: real 1.807s user 0.630s sys 1.155s
W20250624 02:14:40.785738 1753 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:14:42.588640 1746 thread.cc:606] OpenStack (cloud detector) Time spent starting thread: real 1.808s user 0.630s sys 1.156s
W20250624 02:14:42.590984 1755 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:14:42.593309 1754 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Resolving timed out after 1808 milliseconds
I20250624 02:14:42.593369 1746 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
I20250624 02:14:42.594570 1746 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:14:42.596625 1746 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:14:42.598004 1746 hybrid_clock.cc:648] HybridClock initialized: now 1750731282597956 us; error 48 us; skew 500 ppm
I20250624 02:14:42.598779 1746 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:14:42.604933 1746 webserver.cc:469] Webserver started at http://127.31.42.194:44393/ using document root <none> and password file <none>
I20250624 02:14:42.605844 1746 fs_manager.cc:362] Metadata directory not provided
I20250624 02:14:42.606112 1746 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:14:42.606545 1746 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:14:42.610880 1746 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/instance:
uuid: "125a2b6348d24273b09689abf347a21f"
format_stamp: "Formatted at 2025-06-24 02:14:42 on dist-test-slave-5k9r"
I20250624 02:14:42.611979 1746 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal/instance:
uuid: "125a2b6348d24273b09689abf347a21f"
format_stamp: "Formatted at 2025-06-24 02:14:42 on dist-test-slave-5k9r"
I20250624 02:14:42.619120 1746 fs_manager.cc:696] Time spent creating directory manager: real 0.007s user 0.006s sys 0.002s
I20250624 02:14:42.624608 1762 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:42.625586 1746 fs_manager.cc:730] Time spent opening block manager: real 0.003s user 0.001s sys 0.001s
I20250624 02:14:42.625921 1746 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
uuid: "125a2b6348d24273b09689abf347a21f"
format_stamp: "Formatted at 2025-06-24 02:14:42 on dist-test-slave-5k9r"
I20250624 02:14:42.626266 1746 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:14:42.678113 1746 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:14:42.679594 1746 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:14:42.680034 1746 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:14:42.682549 1746 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:14:42.686615 1746 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 02:14:42.686849 1746 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:42.687100 1746 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 02:14:42.687258 1746 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:42.824141 1746 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.194:35057
I20250624 02:14:42.824249 1875 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.194:35057 every 8 connection(s)
I20250624 02:14:42.826773 1746 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/info.pb
I20250624 02:14:42.833676 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 1746
I20250624 02:14:42.834317 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal/instance
I20250624 02:14:42.840934 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.195:0
--local_ip_for_outbound_sockets=127.31.42.195
--webserver_interface=127.31.42.195
--webserver_port=0
--tserver_master_addrs=127.31.42.254:42041
--builtin_ntp_servers=127.31.42.212:37217
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--consensus_rpc_timeout_ms=30000 with env {}
I20250624 02:14:42.848080 1876 heartbeater.cc:344] Connected to a master server at 127.31.42.254:42041
I20250624 02:14:42.848497 1876 heartbeater.cc:461] Registering TS with master...
I20250624 02:14:42.849610 1876 heartbeater.cc:507] Master 127.31.42.254:42041 requested a full tablet report, sending...
I20250624 02:14:42.852187 1554 ts_manager.cc:194] Registered new tserver with Master: 125a2b6348d24273b09689abf347a21f (127.31.42.194:35057)
I20250624 02:14:42.853511 1554 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.194:33067
W20250624 02:14:43.173853 1880 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:14:43.174407 1880 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:14:43.174899 1880 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:14:43.206624 1880 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:14:43.207463 1880 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.195
I20250624 02:14:43.241762 1880 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:37217
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.195:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/info.pb
--webserver_interface=127.31.42.195
--webserver_port=0
--tserver_master_addrs=127.31.42.254:42041
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.195
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:14:43.243060 1880 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:14:43.244663 1880 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:14:43.260268 1886 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:14:43.857429 1876 heartbeater.cc:499] Master 127.31.42.254:42041 was elected leader, sending a full tablet report...
W20250624 02:14:44.082907 1872 debug-util.cc:398] Leaking SignalData structure 0x7b08000ac060 after lost signal to thread 1747
W20250624 02:14:44.083842 1872 debug-util.cc:398] Leaking SignalData structure 0x7b08000acf60 after lost signal to thread 1875
W20250624 02:14:43.260614 1887 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:14:43.262894 1880 server_base.cc:1048] running on GCE node
W20250624 02:14:43.261744 1889 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:14:44.464418 1880 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:14:44.467192 1880 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:14:44.468616 1880 hybrid_clock.cc:648] HybridClock initialized: now 1750731284468565 us; error 71 us; skew 500 ppm
I20250624 02:14:44.469422 1880 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:14:44.484686 1880 webserver.cc:469] Webserver started at http://127.31.42.195:36049/ using document root <none> and password file <none>
I20250624 02:14:44.485644 1880 fs_manager.cc:362] Metadata directory not provided
I20250624 02:14:44.485850 1880 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:14:44.486359 1880 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:14:44.492189 1880 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/instance:
uuid: "bd3ced2853a24d47abc5cb11c8e839ac"
format_stamp: "Formatted at 2025-06-24 02:14:44 on dist-test-slave-5k9r"
I20250624 02:14:44.493279 1880 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal/instance:
uuid: "bd3ced2853a24d47abc5cb11c8e839ac"
format_stamp: "Formatted at 2025-06-24 02:14:44 on dist-test-slave-5k9r"
I20250624 02:14:44.500906 1880 fs_manager.cc:696] Time spent creating directory manager: real 0.007s user 0.009s sys 0.000s
I20250624 02:14:44.506840 1896 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:44.507970 1880 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.004s sys 0.000s
I20250624 02:14:44.508375 1880 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
uuid: "bd3ced2853a24d47abc5cb11c8e839ac"
format_stamp: "Formatted at 2025-06-24 02:14:44 on dist-test-slave-5k9r"
I20250624 02:14:44.508806 1880 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:14:44.577920 1880 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:14:44.579445 1880 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:14:44.579874 1880 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:14:44.582911 1880 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:14:44.587352 1880 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 02:14:44.587572 1880 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:44.587817 1880 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 02:14:44.587990 1880 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:44.759627 1880 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.195:39565
I20250624 02:14:44.759912 2008 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.195:39565 every 8 connection(s)
I20250624 02:14:44.763059 1880 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/info.pb
I20250624 02:14:44.772320 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 1880
I20250624 02:14:44.772930 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestAuthzResetCacheNotAuthorized.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal/instance
I20250624 02:14:44.801131 2009 heartbeater.cc:344] Connected to a master server at 127.31.42.254:42041
I20250624 02:14:44.801692 2009 heartbeater.cc:461] Registering TS with master...
I20250624 02:14:44.803120 2009 heartbeater.cc:507] Master 127.31.42.254:42041 requested a full tablet report, sending...
I20250624 02:14:44.805559 1554 ts_manager.cc:194] Registered new tserver with Master: bd3ced2853a24d47abc5cb11c8e839ac (127.31.42.195:39565)
I20250624 02:14:44.806821 1554 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.195:36947
I20250624 02:14:44.810788 31915 external_mini_cluster.cc:934] 3 TS(s) registered with all masters
I20250624 02:14:44.847389 1554 catalog_manager.cc:2232] Servicing CreateTable request from {username='slave'} at 127.0.0.1:44128:
name: "TestTable"
schema {
columns {
name: "key"
type: INT32
is_key: true
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "int_val"
type: INT32
is_key: false
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "string_val"
type: STRING
is_key: false
is_nullable: true
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
range_schema {
columns {
name: "key"
}
}
}
owner: "alice"
W20250624 02:14:44.867658 1554 catalog_manager.cc:6944] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table TestTable in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20250624 02:14:44.941690 1944 tablet_service.cc:1468] Processing CreateTablet for tablet 5699353d69574d36b5167a3306cc269d (DEFAULT_TABLE table=TestTable [id=f28acbfacb0b453480b946afbba6130e]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 02:14:44.946862 1944 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 5699353d69574d36b5167a3306cc269d. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:14:44.948176 1677 tablet_service.cc:1468] Processing CreateTablet for tablet 5699353d69574d36b5167a3306cc269d (DEFAULT_TABLE table=TestTable [id=f28acbfacb0b453480b946afbba6130e]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 02:14:44.950250 1677 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 5699353d69574d36b5167a3306cc269d. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:14:44.954377 1811 tablet_service.cc:1468] Processing CreateTablet for tablet 5699353d69574d36b5167a3306cc269d (DEFAULT_TABLE table=TestTable [id=f28acbfacb0b453480b946afbba6130e]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 02:14:44.956373 1811 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 5699353d69574d36b5167a3306cc269d. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:14:44.994464 2029 tablet_bootstrap.cc:492] T 5699353d69574d36b5167a3306cc269d P bd3ced2853a24d47abc5cb11c8e839ac: Bootstrap starting.
I20250624 02:14:44.995359 2028 tablet_bootstrap.cc:492] T 5699353d69574d36b5167a3306cc269d P 79674f959e2e4086884e5625971210e9: Bootstrap starting.
I20250624 02:14:44.996583 2030 tablet_bootstrap.cc:492] T 5699353d69574d36b5167a3306cc269d P 125a2b6348d24273b09689abf347a21f: Bootstrap starting.
I20250624 02:14:45.004444 2030 tablet_bootstrap.cc:654] T 5699353d69574d36b5167a3306cc269d P 125a2b6348d24273b09689abf347a21f: Neither blocks nor log segments found. Creating new log.
I20250624 02:14:45.004442 2028 tablet_bootstrap.cc:654] T 5699353d69574d36b5167a3306cc269d P 79674f959e2e4086884e5625971210e9: Neither blocks nor log segments found. Creating new log.
I20250624 02:14:45.005095 2029 tablet_bootstrap.cc:654] T 5699353d69574d36b5167a3306cc269d P bd3ced2853a24d47abc5cb11c8e839ac: Neither blocks nor log segments found. Creating new log.
I20250624 02:14:45.007048 2030 log.cc:826] T 5699353d69574d36b5167a3306cc269d P 125a2b6348d24273b09689abf347a21f: Log is configured to *not* fsync() on all Append() calls
I20250624 02:14:45.007479 2029 log.cc:826] T 5699353d69574d36b5167a3306cc269d P bd3ced2853a24d47abc5cb11c8e839ac: Log is configured to *not* fsync() on all Append() calls
I20250624 02:14:45.007505 2028 log.cc:826] T 5699353d69574d36b5167a3306cc269d P 79674f959e2e4086884e5625971210e9: Log is configured to *not* fsync() on all Append() calls
I20250624 02:14:45.018714 2030 tablet_bootstrap.cc:492] T 5699353d69574d36b5167a3306cc269d P 125a2b6348d24273b09689abf347a21f: No bootstrap required, opened a new log
I20250624 02:14:45.019371 2030 ts_tablet_manager.cc:1397] T 5699353d69574d36b5167a3306cc269d P 125a2b6348d24273b09689abf347a21f: Time spent bootstrapping tablet: real 0.023s user 0.006s sys 0.015s
I20250624 02:14:45.022360 2028 tablet_bootstrap.cc:492] T 5699353d69574d36b5167a3306cc269d P 79674f959e2e4086884e5625971210e9: No bootstrap required, opened a new log
I20250624 02:14:45.023167 2028 ts_tablet_manager.cc:1397] T 5699353d69574d36b5167a3306cc269d P 79674f959e2e4086884e5625971210e9: Time spent bootstrapping tablet: real 0.029s user 0.016s sys 0.004s
I20250624 02:14:45.031710 2029 tablet_bootstrap.cc:492] T 5699353d69574d36b5167a3306cc269d P bd3ced2853a24d47abc5cb11c8e839ac: No bootstrap required, opened a new log
I20250624 02:14:45.032335 2029 ts_tablet_manager.cc:1397] T 5699353d69574d36b5167a3306cc269d P bd3ced2853a24d47abc5cb11c8e839ac: Time spent bootstrapping tablet: real 0.047s user 0.015s sys 0.019s
I20250624 02:14:45.049185 2030 raft_consensus.cc:357] T 5699353d69574d36b5167a3306cc269d P 125a2b6348d24273b09689abf347a21f [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "125a2b6348d24273b09689abf347a21f" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35057 } } peers { permanent_uuid: "bd3ced2853a24d47abc5cb11c8e839ac" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 39565 } } peers { permanent_uuid: "79674f959e2e4086884e5625971210e9" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 36445 } }
I20250624 02:14:45.050384 2030 raft_consensus.cc:383] T 5699353d69574d36b5167a3306cc269d P 125a2b6348d24273b09689abf347a21f [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:14:45.050796 2030 raft_consensus.cc:738] T 5699353d69574d36b5167a3306cc269d P 125a2b6348d24273b09689abf347a21f [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 125a2b6348d24273b09689abf347a21f, State: Initialized, Role: FOLLOWER
I20250624 02:14:45.051970 2030 consensus_queue.cc:260] T 5699353d69574d36b5167a3306cc269d P 125a2b6348d24273b09689abf347a21f [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "125a2b6348d24273b09689abf347a21f" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35057 } } peers { permanent_uuid: "bd3ced2853a24d47abc5cb11c8e839ac" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 39565 } } peers { permanent_uuid: "79674f959e2e4086884e5625971210e9" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 36445 } }
I20250624 02:14:45.052270 2028 raft_consensus.cc:357] T 5699353d69574d36b5167a3306cc269d P 79674f959e2e4086884e5625971210e9 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "125a2b6348d24273b09689abf347a21f" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35057 } } peers { permanent_uuid: "bd3ced2853a24d47abc5cb11c8e839ac" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 39565 } } peers { permanent_uuid: "79674f959e2e4086884e5625971210e9" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 36445 } }
I20250624 02:14:45.053347 2028 raft_consensus.cc:383] T 5699353d69574d36b5167a3306cc269d P 79674f959e2e4086884e5625971210e9 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:14:45.053694 2028 raft_consensus.cc:738] T 5699353d69574d36b5167a3306cc269d P 79674f959e2e4086884e5625971210e9 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 79674f959e2e4086884e5625971210e9, State: Initialized, Role: FOLLOWER
I20250624 02:14:45.054857 2028 consensus_queue.cc:260] T 5699353d69574d36b5167a3306cc269d P 79674f959e2e4086884e5625971210e9 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "125a2b6348d24273b09689abf347a21f" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35057 } } peers { permanent_uuid: "bd3ced2853a24d47abc5cb11c8e839ac" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 39565 } } peers { permanent_uuid: "79674f959e2e4086884e5625971210e9" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 36445 } }
I20250624 02:14:45.062645 2028 ts_tablet_manager.cc:1428] T 5699353d69574d36b5167a3306cc269d P 79674f959e2e4086884e5625971210e9: Time spent starting tablet: real 0.039s user 0.024s sys 0.012s
I20250624 02:14:45.063833 2029 raft_consensus.cc:357] T 5699353d69574d36b5167a3306cc269d P bd3ced2853a24d47abc5cb11c8e839ac [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "125a2b6348d24273b09689abf347a21f" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35057 } } peers { permanent_uuid: "bd3ced2853a24d47abc5cb11c8e839ac" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 39565 } } peers { permanent_uuid: "79674f959e2e4086884e5625971210e9" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 36445 } }
I20250624 02:14:45.064777 2029 raft_consensus.cc:383] T 5699353d69574d36b5167a3306cc269d P bd3ced2853a24d47abc5cb11c8e839ac [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:14:45.065092 2029 raft_consensus.cc:738] T 5699353d69574d36b5167a3306cc269d P bd3ced2853a24d47abc5cb11c8e839ac [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: bd3ced2853a24d47abc5cb11c8e839ac, State: Initialized, Role: FOLLOWER
I20250624 02:14:45.066207 2029 consensus_queue.cc:260] T 5699353d69574d36b5167a3306cc269d P bd3ced2853a24d47abc5cb11c8e839ac [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "125a2b6348d24273b09689abf347a21f" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35057 } } peers { permanent_uuid: "bd3ced2853a24d47abc5cb11c8e839ac" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 39565 } } peers { permanent_uuid: "79674f959e2e4086884e5625971210e9" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 36445 } }
I20250624 02:14:45.077899 2009 heartbeater.cc:499] Master 127.31.42.254:42041 was elected leader, sending a full tablet report...
I20250624 02:14:45.079916 2029 ts_tablet_manager.cc:1428] T 5699353d69574d36b5167a3306cc269d P bd3ced2853a24d47abc5cb11c8e839ac: Time spent starting tablet: real 0.047s user 0.015s sys 0.023s
I20250624 02:14:45.083211 2035 raft_consensus.cc:491] T 5699353d69574d36b5167a3306cc269d P 79674f959e2e4086884e5625971210e9 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250624 02:14:45.085616 2035 raft_consensus.cc:513] T 5699353d69574d36b5167a3306cc269d P 79674f959e2e4086884e5625971210e9 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "125a2b6348d24273b09689abf347a21f" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35057 } } peers { permanent_uuid: "bd3ced2853a24d47abc5cb11c8e839ac" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 39565 } } peers { permanent_uuid: "79674f959e2e4086884e5625971210e9" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 36445 } }
I20250624 02:14:45.089183 2030 ts_tablet_manager.cc:1428] T 5699353d69574d36b5167a3306cc269d P 125a2b6348d24273b09689abf347a21f: Time spent starting tablet: real 0.069s user 0.024s sys 0.032s
I20250624 02:14:45.094238 2035 leader_election.cc:290] T 5699353d69574d36b5167a3306cc269d P 79674f959e2e4086884e5625971210e9 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 125a2b6348d24273b09689abf347a21f (127.31.42.194:35057), bd3ced2853a24d47abc5cb11c8e839ac (127.31.42.195:39565)
I20250624 02:14:45.111075 1964 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "5699353d69574d36b5167a3306cc269d" candidate_uuid: "79674f959e2e4086884e5625971210e9" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "bd3ced2853a24d47abc5cb11c8e839ac" is_pre_election: true
I20250624 02:14:45.111982 1964 raft_consensus.cc:2466] T 5699353d69574d36b5167a3306cc269d P bd3ced2853a24d47abc5cb11c8e839ac [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 79674f959e2e4086884e5625971210e9 in term 0.
I20250624 02:14:45.113448 1631 leader_election.cc:304] T 5699353d69574d36b5167a3306cc269d P 79674f959e2e4086884e5625971210e9 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 79674f959e2e4086884e5625971210e9, bd3ced2853a24d47abc5cb11c8e839ac; no voters:
I20250624 02:14:45.114399 2035 raft_consensus.cc:2802] T 5699353d69574d36b5167a3306cc269d P 79674f959e2e4086884e5625971210e9 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20250624 02:14:45.114863 2035 raft_consensus.cc:491] T 5699353d69574d36b5167a3306cc269d P 79674f959e2e4086884e5625971210e9 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20250624 02:14:45.115262 2035 raft_consensus.cc:3058] T 5699353d69574d36b5167a3306cc269d P 79674f959e2e4086884e5625971210e9 [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:14:45.117456 1831 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "5699353d69574d36b5167a3306cc269d" candidate_uuid: "79674f959e2e4086884e5625971210e9" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "125a2b6348d24273b09689abf347a21f" is_pre_election: true
I20250624 02:14:45.118296 1831 raft_consensus.cc:2466] T 5699353d69574d36b5167a3306cc269d P 125a2b6348d24273b09689abf347a21f [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 79674f959e2e4086884e5625971210e9 in term 0.
W20250624 02:14:45.121220 1743 tablet.cc:2378] T 5699353d69574d36b5167a3306cc269d P 79674f959e2e4086884e5625971210e9: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20250624 02:14:45.123196 2035 raft_consensus.cc:513] T 5699353d69574d36b5167a3306cc269d P 79674f959e2e4086884e5625971210e9 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "125a2b6348d24273b09689abf347a21f" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35057 } } peers { permanent_uuid: "bd3ced2853a24d47abc5cb11c8e839ac" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 39565 } } peers { permanent_uuid: "79674f959e2e4086884e5625971210e9" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 36445 } }
I20250624 02:14:45.125222 2035 leader_election.cc:290] T 5699353d69574d36b5167a3306cc269d P 79674f959e2e4086884e5625971210e9 [CANDIDATE]: Term 1 election: Requested vote from peers 125a2b6348d24273b09689abf347a21f (127.31.42.194:35057), bd3ced2853a24d47abc5cb11c8e839ac (127.31.42.195:39565)
I20250624 02:14:45.126492 1964 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "5699353d69574d36b5167a3306cc269d" candidate_uuid: "79674f959e2e4086884e5625971210e9" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "bd3ced2853a24d47abc5cb11c8e839ac"
I20250624 02:14:45.127090 1964 raft_consensus.cc:3058] T 5699353d69574d36b5167a3306cc269d P bd3ced2853a24d47abc5cb11c8e839ac [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:14:45.126024 1831 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "5699353d69574d36b5167a3306cc269d" candidate_uuid: "79674f959e2e4086884e5625971210e9" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "125a2b6348d24273b09689abf347a21f"
I20250624 02:14:45.127898 1831 raft_consensus.cc:3058] T 5699353d69574d36b5167a3306cc269d P 125a2b6348d24273b09689abf347a21f [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:14:45.135967 1831 raft_consensus.cc:2466] T 5699353d69574d36b5167a3306cc269d P 125a2b6348d24273b09689abf347a21f [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 79674f959e2e4086884e5625971210e9 in term 1.
I20250624 02:14:45.137492 1964 raft_consensus.cc:2466] T 5699353d69574d36b5167a3306cc269d P bd3ced2853a24d47abc5cb11c8e839ac [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 79674f959e2e4086884e5625971210e9 in term 1.
I20250624 02:14:45.137399 1633 leader_election.cc:304] T 5699353d69574d36b5167a3306cc269d P 79674f959e2e4086884e5625971210e9 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 125a2b6348d24273b09689abf347a21f, 79674f959e2e4086884e5625971210e9; no voters:
I20250624 02:14:45.138491 2035 raft_consensus.cc:2802] T 5699353d69574d36b5167a3306cc269d P 79674f959e2e4086884e5625971210e9 [term 1 FOLLOWER]: Leader election won for term 1
I20250624 02:14:45.139050 2035 raft_consensus.cc:695] T 5699353d69574d36b5167a3306cc269d P 79674f959e2e4086884e5625971210e9 [term 1 LEADER]: Becoming Leader. State: Replica: 79674f959e2e4086884e5625971210e9, State: Running, Role: LEADER
I20250624 02:14:45.140107 2035 consensus_queue.cc:237] T 5699353d69574d36b5167a3306cc269d P 79674f959e2e4086884e5625971210e9 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "125a2b6348d24273b09689abf347a21f" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35057 } } peers { permanent_uuid: "bd3ced2853a24d47abc5cb11c8e839ac" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 39565 } } peers { permanent_uuid: "79674f959e2e4086884e5625971210e9" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 36445 } }
I20250624 02:14:45.152601 1554 catalog_manager.cc:5582] T 5699353d69574d36b5167a3306cc269d P 79674f959e2e4086884e5625971210e9 reported cstate change: term changed from 0 to 1, leader changed from <none> to 79674f959e2e4086884e5625971210e9 (127.31.42.193). New cstate: current_term: 1 leader_uuid: "79674f959e2e4086884e5625971210e9" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "125a2b6348d24273b09689abf347a21f" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35057 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "bd3ced2853a24d47abc5cb11c8e839ac" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 39565 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "79674f959e2e4086884e5625971210e9" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 36445 } health_report { overall_health: HEALTHY } } }
I20250624 02:14:45.223691 31915 external_mini_cluster.cc:934] 3 TS(s) registered with all masters
I20250624 02:14:45.227180 31915 ts_itest-base.cc:246] Waiting for 1 tablets on tserver 79674f959e2e4086884e5625971210e9 to finish bootstrapping
I20250624 02:14:45.241151 31915 ts_itest-base.cc:246] Waiting for 1 tablets on tserver 125a2b6348d24273b09689abf347a21f to finish bootstrapping
I20250624 02:14:45.253878 31915 ts_itest-base.cc:246] Waiting for 1 tablets on tserver bd3ced2853a24d47abc5cb11c8e839ac to finish bootstrapping
W20250624 02:14:45.274665 2010 tablet.cc:2378] T 5699353d69574d36b5167a3306cc269d P bd3ced2853a24d47abc5cb11c8e839ac: Can't schedule compaction. Clean time has not been advanced past its initial value.
W20250624 02:14:45.334228 1877 tablet.cc:2378] T 5699353d69574d36b5167a3306cc269d P 125a2b6348d24273b09689abf347a21f: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20250624 02:14:45.582837 2035 consensus_queue.cc:1035] T 5699353d69574d36b5167a3306cc269d P 79674f959e2e4086884e5625971210e9 [LEADER]: Connected to new peer: Peer: permanent_uuid: "bd3ced2853a24d47abc5cb11c8e839ac" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 39565 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20250624 02:14:45.619800 2035 consensus_queue.cc:1035] T 5699353d69574d36b5167a3306cc269d P 79674f959e2e4086884e5625971210e9 [LEADER]: Connected to new peer: Peer: permanent_uuid: "125a2b6348d24273b09689abf347a21f" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35057 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
W20250624 02:14:47.034560 1554 server_base.cc:1130] Unauthorized access attempt to method kudu.master.MasterService.RefreshAuthzCache from {username='slave'} at 127.0.0.1:44136
I20250624 02:14:48.065147 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 1613
I20250624 02:14:48.091842 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 1746
I20250624 02:14:48.122740 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 1880
I20250624 02:14:48.149566 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 1521
2025-06-24T02:14:48Z chronyd exiting
[ OK ] AdminCliTest.TestAuthzResetCacheNotAuthorized (11675 ms)
[ RUN ] AdminCliTest.TestRebuildTables
I20250624 02:14:48.201476 31915 test_util.cc:276] Using random seed: -460863254
I20250624 02:14:48.205520 31915 ts_itest-base.cc:115] Starting cluster with:
I20250624 02:14:48.205696 31915 ts_itest-base.cc:116] --------------
I20250624 02:14:48.205862 31915 ts_itest-base.cc:117] 3 tablet servers
I20250624 02:14:48.206053 31915 ts_itest-base.cc:118] 3 replicas per TS
I20250624 02:14:48.206197 31915 ts_itest-base.cc:119] --------------
2025-06-24T02:14:48Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2025-06-24T02:14:48Z Disabled control of system clock
I20250624 02:14:48.244777 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.31.42.254:39985
--webserver_interface=127.31.42.254
--webserver_port=0
--builtin_ntp_servers=127.31.42.212:37039
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.31.42.254:39985 with env {}
W20250624 02:14:48.547266 2078 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:14:48.547830 2078 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:14:48.548239 2078 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:14:48.580087 2078 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20250624 02:14:48.580376 2078 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:14:48.580581 2078 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20250624 02:14:48.580785 2078 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20250624 02:14:48.616315 2078 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:37039
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.31.42.254:39985
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.31.42.254:39985
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb
--webserver_interface=127.31.42.254
--webserver_port=0
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/logs
--logbuflevel=-1
--logtostderr=true
Master server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:14:48.617574 2078 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:14:48.619261 2078 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:14:48.635529 2084 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:14:48.635612 2085 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:14:48.637846 2078 server_base.cc:1048] running on GCE node
W20250624 02:14:48.637068 2087 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:14:49.823518 2078 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:14:49.826691 2078 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:14:49.828146 2078 hybrid_clock.cc:648] HybridClock initialized: now 1750731289828107 us; error 54 us; skew 500 ppm
I20250624 02:14:49.828980 2078 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:14:49.838301 2078 webserver.cc:469] Webserver started at http://127.31.42.254:36667/ using document root <none> and password file <none>
I20250624 02:14:49.839258 2078 fs_manager.cc:362] Metadata directory not provided
I20250624 02:14:49.839463 2078 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:14:49.839942 2078 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:14:49.844408 2078 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/instance:
uuid: "647b3bf07b6d4b56a97941e98dcfc165"
format_stamp: "Formatted at 2025-06-24 02:14:49 on dist-test-slave-5k9r"
I20250624 02:14:49.845515 2078 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal/instance:
uuid: "647b3bf07b6d4b56a97941e98dcfc165"
format_stamp: "Formatted at 2025-06-24 02:14:49 on dist-test-slave-5k9r"
I20250624 02:14:49.852689 2078 fs_manager.cc:696] Time spent creating directory manager: real 0.007s user 0.005s sys 0.000s
I20250624 02:14:49.858318 2094 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:49.859347 2078 fs_manager.cc:730] Time spent opening block manager: real 0.003s user 0.003s sys 0.000s
I20250624 02:14:49.859683 2078 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
uuid: "647b3bf07b6d4b56a97941e98dcfc165"
format_stamp: "Formatted at 2025-06-24 02:14:49 on dist-test-slave-5k9r"
I20250624 02:14:49.859993 2078 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:14:49.906693 2078 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:14:49.908146 2078 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:14:49.908581 2078 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:14:49.978513 2078 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.254:39985
I20250624 02:14:49.978595 2145 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.254:39985 every 8 connection(s)
I20250624 02:14:49.981159 2078 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb
I20250624 02:14:49.986389 2146 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:14:49.987136 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 2078
I20250624 02:14:49.987618 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal/instance
I20250624 02:14:50.009565 2146 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165: Bootstrap starting.
I20250624 02:14:50.015336 2146 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165: Neither blocks nor log segments found. Creating new log.
I20250624 02:14:50.016974 2146 log.cc:826] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165: Log is configured to *not* fsync() on all Append() calls
I20250624 02:14:50.021304 2146 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165: No bootstrap required, opened a new log
I20250624 02:14:50.039369 2146 raft_consensus.cc:357] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "647b3bf07b6d4b56a97941e98dcfc165" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 39985 } }
I20250624 02:14:50.040035 2146 raft_consensus.cc:383] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:14:50.040274 2146 raft_consensus.cc:738] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 647b3bf07b6d4b56a97941e98dcfc165, State: Initialized, Role: FOLLOWER
I20250624 02:14:50.040952 2146 consensus_queue.cc:260] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "647b3bf07b6d4b56a97941e98dcfc165" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 39985 } }
I20250624 02:14:50.041438 2146 raft_consensus.cc:397] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 02:14:50.041723 2146 raft_consensus.cc:491] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 02:14:50.042086 2146 raft_consensus.cc:3058] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:14:50.046068 2146 raft_consensus.cc:513] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "647b3bf07b6d4b56a97941e98dcfc165" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 39985 } }
I20250624 02:14:50.046757 2146 leader_election.cc:304] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 647b3bf07b6d4b56a97941e98dcfc165; no voters:
I20250624 02:14:50.048430 2146 leader_election.cc:290] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [CANDIDATE]: Term 1 election: Requested vote from peers
I20250624 02:14:50.049177 2151 raft_consensus.cc:2802] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 1 FOLLOWER]: Leader election won for term 1
I20250624 02:14:50.051839 2151 raft_consensus.cc:695] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 1 LEADER]: Becoming Leader. State: Replica: 647b3bf07b6d4b56a97941e98dcfc165, State: Running, Role: LEADER
I20250624 02:14:50.052631 2146 sys_catalog.cc:564] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [sys.catalog]: configured and running, proceeding with master startup.
I20250624 02:14:50.052685 2151 consensus_queue.cc:237] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "647b3bf07b6d4b56a97941e98dcfc165" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 39985 } }
I20250624 02:14:50.063999 2152 sys_catalog.cc:455] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "647b3bf07b6d4b56a97941e98dcfc165" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "647b3bf07b6d4b56a97941e98dcfc165" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 39985 } } }
I20250624 02:14:50.064942 2152 sys_catalog.cc:458] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [sys.catalog]: This master's current role is: LEADER
I20250624 02:14:50.066555 2153 sys_catalog.cc:455] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 647b3bf07b6d4b56a97941e98dcfc165. Latest consensus state: current_term: 1 leader_uuid: "647b3bf07b6d4b56a97941e98dcfc165" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "647b3bf07b6d4b56a97941e98dcfc165" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 39985 } } }
I20250624 02:14:50.067423 2153 sys_catalog.cc:458] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [sys.catalog]: This master's current role is: LEADER
I20250624 02:14:50.067610 2160 catalog_manager.cc:1477] Loading table and tablet metadata into memory...
I20250624 02:14:50.082856 2160 catalog_manager.cc:1486] Initializing Kudu cluster ID...
I20250624 02:14:50.099296 2160 catalog_manager.cc:1349] Generated new cluster ID: 5528683bbe9a41fc8b42fa74b6850b6d
I20250624 02:14:50.099711 2160 catalog_manager.cc:1497] Initializing Kudu internal certificate authority...
I20250624 02:14:50.119287 2160 catalog_manager.cc:1372] Generated new certificate authority record
I20250624 02:14:50.120718 2160 catalog_manager.cc:1506] Loading token signing keys...
I20250624 02:14:50.134543 2160 catalog_manager.cc:5955] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165: Generated new TSK 0
I20250624 02:14:50.135623 2160 catalog_manager.cc:1516] Initializing in-progress tserver states...
I20250624 02:14:50.156996 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.193:0
--local_ip_for_outbound_sockets=127.31.42.193
--webserver_interface=127.31.42.193
--webserver_port=0
--tserver_master_addrs=127.31.42.254:39985
--builtin_ntp_servers=127.31.42.212:37039
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--consensus_rpc_timeout_ms=30000 with env {}
W20250624 02:14:50.457894 2170 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:14:50.458456 2170 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:14:50.458949 2170 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:14:50.490182 2170 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:14:50.491067 2170 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.193
I20250624 02:14:50.527312 2170 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:37039
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.193:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/info.pb
--webserver_interface=127.31.42.193
--webserver_port=0
--tserver_master_addrs=127.31.42.254:39985
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.193
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:14:50.528577 2170 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:14:50.530227 2170 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:14:50.547214 2177 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:14:50.548871 2179 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:14:50.547220 2176 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:14:52.138346 2178 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Resolving timed out after 1589 milliseconds
I20250624 02:14:52.138463 2170 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
I20250624 02:14:52.139750 2170 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:14:52.142541 2170 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:14:52.144014 2170 hybrid_clock.cc:648] HybridClock initialized: now 1750731292143969 us; error 63 us; skew 500 ppm
I20250624 02:14:52.144840 2170 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:14:52.151865 2170 webserver.cc:469] Webserver started at http://127.31.42.193:38599/ using document root <none> and password file <none>
I20250624 02:14:52.152805 2170 fs_manager.cc:362] Metadata directory not provided
I20250624 02:14:52.153021 2170 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:14:52.153491 2170 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:14:52.157967 2170 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/instance:
uuid: "a419c27a5f5b4eaa819ede2e09199dc0"
format_stamp: "Formatted at 2025-06-24 02:14:52 on dist-test-slave-5k9r"
I20250624 02:14:52.159126 2170 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal/instance:
uuid: "a419c27a5f5b4eaa819ede2e09199dc0"
format_stamp: "Formatted at 2025-06-24 02:14:52 on dist-test-slave-5k9r"
I20250624 02:14:52.166433 2170 fs_manager.cc:696] Time spent creating directory manager: real 0.007s user 0.006s sys 0.001s
I20250624 02:14:52.172933 2186 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:52.174091 2170 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.007s sys 0.000s
I20250624 02:14:52.174388 2170 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
uuid: "a419c27a5f5b4eaa819ede2e09199dc0"
format_stamp: "Formatted at 2025-06-24 02:14:52 on dist-test-slave-5k9r"
I20250624 02:14:52.174703 2170 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:14:52.239380 2170 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:14:52.240809 2170 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:14:52.241199 2170 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:14:52.243645 2170 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:14:52.247674 2170 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 02:14:52.247880 2170 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:52.248080 2170 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 02:14:52.248211 2170 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:52.384632 2170 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.193:39677
I20250624 02:14:52.384760 2298 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.193:39677 every 8 connection(s)
I20250624 02:14:52.387210 2170 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/info.pb
I20250624 02:14:52.392297 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 2170
I20250624 02:14:52.392882 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal/instance
I20250624 02:14:52.399171 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.194:0
--local_ip_for_outbound_sockets=127.31.42.194
--webserver_interface=127.31.42.194
--webserver_port=0
--tserver_master_addrs=127.31.42.254:39985
--builtin_ntp_servers=127.31.42.212:37039
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--consensus_rpc_timeout_ms=30000 with env {}
I20250624 02:14:52.409581 2299 heartbeater.cc:344] Connected to a master server at 127.31.42.254:39985
I20250624 02:14:52.410048 2299 heartbeater.cc:461] Registering TS with master...
I20250624 02:14:52.411068 2299 heartbeater.cc:507] Master 127.31.42.254:39985 requested a full tablet report, sending...
I20250624 02:14:52.413450 2111 ts_manager.cc:194] Registered new tserver with Master: a419c27a5f5b4eaa819ede2e09199dc0 (127.31.42.193:39677)
I20250624 02:14:52.415467 2111 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.193:54613
W20250624 02:14:52.698292 2303 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:14:52.698817 2303 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:14:52.699333 2303 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:14:52.730852 2303 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:14:52.731758 2303 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.194
I20250624 02:14:52.770340 2303 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:37039
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.194:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/info.pb
--webserver_interface=127.31.42.194
--webserver_port=0
--tserver_master_addrs=127.31.42.254:39985
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.194
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:14:52.771631 2303 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:14:52.773305 2303 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:14:52.788681 2311 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:14:53.418766 2299 heartbeater.cc:499] Master 127.31.42.254:39985 was elected leader, sending a full tablet report...
W20250624 02:14:52.789990 2313 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:14:52.792097 2303 server_base.cc:1048] running on GCE node
W20250624 02:14:52.790448 2310 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:14:53.958791 2303 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:14:53.961678 2303 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:14:53.963207 2303 hybrid_clock.cc:648] HybridClock initialized: now 1750731293963169 us; error 79 us; skew 500 ppm
I20250624 02:14:53.964062 2303 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:14:53.970489 2303 webserver.cc:469] Webserver started at http://127.31.42.194:39475/ using document root <none> and password file <none>
I20250624 02:14:53.971572 2303 fs_manager.cc:362] Metadata directory not provided
I20250624 02:14:53.971786 2303 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:14:53.972249 2303 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:14:53.976785 2303 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/instance:
uuid: "6431b2b3e46e4e199cfb1609d7c42607"
format_stamp: "Formatted at 2025-06-24 02:14:53 on dist-test-slave-5k9r"
I20250624 02:14:53.977983 2303 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal/instance:
uuid: "6431b2b3e46e4e199cfb1609d7c42607"
format_stamp: "Formatted at 2025-06-24 02:14:53 on dist-test-slave-5k9r"
I20250624 02:14:53.985930 2303 fs_manager.cc:696] Time spent creating directory manager: real 0.007s user 0.006s sys 0.001s
I20250624 02:14:53.992648 2320 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:53.993821 2303 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.006s sys 0.001s
I20250624 02:14:53.994217 2303 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
uuid: "6431b2b3e46e4e199cfb1609d7c42607"
format_stamp: "Formatted at 2025-06-24 02:14:53 on dist-test-slave-5k9r"
I20250624 02:14:53.994535 2303 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:14:54.046896 2303 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:14:54.048336 2303 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:14:54.048734 2303 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:14:54.051366 2303 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:14:54.056306 2303 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 02:14:54.056511 2303 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:54.056715 2303 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 02:14:54.056855 2303 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:54.195665 2303 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.194:35121
I20250624 02:14:54.195771 2433 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.194:35121 every 8 connection(s)
I20250624 02:14:54.198218 2303 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/info.pb
I20250624 02:14:54.203732 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 2303
I20250624 02:14:54.204448 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal/instance
I20250624 02:14:54.210881 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.195:0
--local_ip_for_outbound_sockets=127.31.42.195
--webserver_interface=127.31.42.195
--webserver_port=0
--tserver_master_addrs=127.31.42.254:39985
--builtin_ntp_servers=127.31.42.212:37039
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--consensus_rpc_timeout_ms=30000 with env {}
I20250624 02:14:54.220422 2434 heartbeater.cc:344] Connected to a master server at 127.31.42.254:39985
I20250624 02:14:54.220865 2434 heartbeater.cc:461] Registering TS with master...
I20250624 02:14:54.221865 2434 heartbeater.cc:507] Master 127.31.42.254:39985 requested a full tablet report, sending...
I20250624 02:14:54.224130 2111 ts_manager.cc:194] Registered new tserver with Master: 6431b2b3e46e4e199cfb1609d7c42607 (127.31.42.194:35121)
I20250624 02:14:54.225364 2111 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.194:42681
W20250624 02:14:54.520208 2438 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:14:54.520717 2438 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:14:54.521201 2438 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:14:54.552067 2438 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:14:54.553076 2438 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.195
I20250624 02:14:54.588215 2438 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:37039
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.195:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/info.pb
--webserver_interface=127.31.42.195
--webserver_port=0
--tserver_master_addrs=127.31.42.254:39985
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.195
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:14:54.589538 2438 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:14:54.591272 2438 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:14:54.607983 2444 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:14:55.228722 2434 heartbeater.cc:499] Master 127.31.42.254:39985 was elected leader, sending a full tablet report...
W20250624 02:14:54.608009 2445 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:14:54.608361 2447 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:14:55.781813 2446 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Connection time-out
I20250624 02:14:55.781919 2438 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
I20250624 02:14:55.785606 2438 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:14:55.787865 2438 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:14:55.789243 2438 hybrid_clock.cc:648] HybridClock initialized: now 1750731295789207 us; error 55 us; skew 500 ppm
I20250624 02:14:55.790112 2438 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:14:55.796614 2438 webserver.cc:469] Webserver started at http://127.31.42.195:39881/ using document root <none> and password file <none>
I20250624 02:14:55.797545 2438 fs_manager.cc:362] Metadata directory not provided
I20250624 02:14:55.797781 2438 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:14:55.798250 2438 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:14:55.802733 2438 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/instance:
uuid: "5a801f07eaeb46368d7fc28f51b9e1d6"
format_stamp: "Formatted at 2025-06-24 02:14:55 on dist-test-slave-5k9r"
I20250624 02:14:55.803838 2438 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal/instance:
uuid: "5a801f07eaeb46368d7fc28f51b9e1d6"
format_stamp: "Formatted at 2025-06-24 02:14:55 on dist-test-slave-5k9r"
I20250624 02:14:55.811342 2438 fs_manager.cc:696] Time spent creating directory manager: real 0.007s user 0.006s sys 0.002s
I20250624 02:14:55.817095 2454 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:55.818224 2438 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.003s sys 0.000s
I20250624 02:14:55.818550 2438 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
uuid: "5a801f07eaeb46368d7fc28f51b9e1d6"
format_stamp: "Formatted at 2025-06-24 02:14:55 on dist-test-slave-5k9r"
I20250624 02:14:55.818889 2438 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:14:55.878340 2438 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:14:55.879837 2438 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:14:55.880268 2438 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:14:55.882987 2438 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:14:55.887271 2438 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 02:14:55.887492 2438 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:55.887742 2438 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 02:14:55.887892 2438 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250624 02:14:56.028976 2438 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.195:42985
I20250624 02:14:56.029088 2566 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.195:42985 every 8 connection(s)
I20250624 02:14:56.031493 2438 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/info.pb
I20250624 02:14:56.037549 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 2438
I20250624 02:14:56.038017 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal/instance
I20250624 02:14:56.053182 2567 heartbeater.cc:344] Connected to a master server at 127.31.42.254:39985
I20250624 02:14:56.053578 2567 heartbeater.cc:461] Registering TS with master...
I20250624 02:14:56.054605 2567 heartbeater.cc:507] Master 127.31.42.254:39985 requested a full tablet report, sending...
I20250624 02:14:56.056951 2111 ts_manager.cc:194] Registered new tserver with Master: 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195:42985)
I20250624 02:14:56.058256 31915 external_mini_cluster.cc:934] 3 TS(s) registered with all masters
I20250624 02:14:56.058473 2111 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.195:56053
I20250624 02:14:56.093420 31915 external_mini_cluster.cc:934] 3 TS(s) registered with all masters
I20250624 02:14:56.093735 31915 test_util.cc:276] Using random seed: -452970985
I20250624 02:14:56.135658 2111 catalog_manager.cc:2232] Servicing CreateTable request from {username='slave'} at 127.0.0.1:49666:
name: "TestTable"
schema {
columns {
name: "key"
type: INT32
is_key: true
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "int_val"
type: INT32
is_key: false
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "string_val"
type: STRING
is_key: false
is_nullable: true
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
}
num_replicas: 1
split_rows_range_bounds {
}
partition_schema {
range_schema {
columns {
name: "key"
}
}
}
I20250624 02:14:56.177392 2234 tablet_service.cc:1468] Processing CreateTablet for tablet 45341658889647099a604223ab78b6db (DEFAULT_TABLE table=TestTable [id=6485e641c3ea4f02b595d4d146aa9315]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 02:14:56.178889 2234 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 45341658889647099a604223ab78b6db. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:14:56.199831 2587 tablet_bootstrap.cc:492] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0: Bootstrap starting.
I20250624 02:14:56.205363 2587 tablet_bootstrap.cc:654] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0: Neither blocks nor log segments found. Creating new log.
I20250624 02:14:56.207356 2587 log.cc:826] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0: Log is configured to *not* fsync() on all Append() calls
I20250624 02:14:56.212184 2587 tablet_bootstrap.cc:492] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0: No bootstrap required, opened a new log
I20250624 02:14:56.212570 2587 ts_tablet_manager.cc:1397] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0: Time spent bootstrapping tablet: real 0.013s user 0.000s sys 0.010s
I20250624 02:14:56.229923 2587 raft_consensus.cc:357] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } }
I20250624 02:14:56.230543 2587 raft_consensus.cc:383] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:14:56.230752 2587 raft_consensus.cc:738] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: a419c27a5f5b4eaa819ede2e09199dc0, State: Initialized, Role: FOLLOWER
I20250624 02:14:56.231371 2587 consensus_queue.cc:260] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } }
I20250624 02:14:56.231840 2587 raft_consensus.cc:397] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 02:14:56.232079 2587 raft_consensus.cc:491] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 02:14:56.232333 2587 raft_consensus.cc:3058] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:14:56.237824 2587 raft_consensus.cc:513] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } }
I20250624 02:14:56.238582 2587 leader_election.cc:304] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: a419c27a5f5b4eaa819ede2e09199dc0; no voters:
I20250624 02:14:56.240288 2587 leader_election.cc:290] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [CANDIDATE]: Term 1 election: Requested vote from peers
I20250624 02:14:56.240669 2589 raft_consensus.cc:2802] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 1 FOLLOWER]: Leader election won for term 1
I20250624 02:14:56.243069 2589 raft_consensus.cc:695] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 1 LEADER]: Becoming Leader. State: Replica: a419c27a5f5b4eaa819ede2e09199dc0, State: Running, Role: LEADER
I20250624 02:14:56.243942 2589 consensus_queue.cc:237] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } }
I20250624 02:14:56.244509 2587 ts_tablet_manager.cc:1428] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0: Time spent starting tablet: real 0.032s user 0.030s sys 0.003s
I20250624 02:14:56.257383 2111 catalog_manager.cc:5582] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 reported cstate change: term changed from 0 to 1, leader changed from <none> to a419c27a5f5b4eaa819ede2e09199dc0 (127.31.42.193). New cstate: current_term: 1 leader_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } health_report { overall_health: HEALTHY } } }
I20250624 02:14:56.459185 31915 test_util.cc:276] Using random seed: -452605550
I20250624 02:14:56.480719 2111 catalog_manager.cc:2232] Servicing CreateTable request from {username='slave'} at 127.0.0.1:49672:
name: "TestTable1"
schema {
columns {
name: "key"
type: INT32
is_key: true
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "int_val"
type: INT32
is_key: false
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "string_val"
type: STRING
is_key: false
is_nullable: true
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
}
num_replicas: 1
split_rows_range_bounds {
}
partition_schema {
range_schema {
columns {
name: "key"
}
}
}
I20250624 02:14:56.509855 2502 tablet_service.cc:1468] Processing CreateTablet for tablet 529e58fc77b545c388e07e325ebb525a (DEFAULT_TABLE table=TestTable1 [id=6df87e5f16424f788b9d39484a0f8e57]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 02:14:56.511490 2502 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 529e58fc77b545c388e07e325ebb525a. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:14:56.530275 2608 tablet_bootstrap.cc:492] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6: Bootstrap starting.
I20250624 02:14:56.536151 2608 tablet_bootstrap.cc:654] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6: Neither blocks nor log segments found. Creating new log.
I20250624 02:14:56.537874 2608 log.cc:826] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6: Log is configured to *not* fsync() on all Append() calls
I20250624 02:14:56.542239 2608 tablet_bootstrap.cc:492] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6: No bootstrap required, opened a new log
I20250624 02:14:56.542658 2608 ts_tablet_manager.cc:1397] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6: Time spent bootstrapping tablet: real 0.013s user 0.008s sys 0.004s
I20250624 02:14:56.560535 2608 raft_consensus.cc:357] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } }
I20250624 02:14:56.561126 2608 raft_consensus.cc:383] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:14:56.561353 2608 raft_consensus.cc:738] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 5a801f07eaeb46368d7fc28f51b9e1d6, State: Initialized, Role: FOLLOWER
I20250624 02:14:56.562111 2608 consensus_queue.cc:260] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } }
I20250624 02:14:56.562683 2608 raft_consensus.cc:397] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 02:14:56.562963 2608 raft_consensus.cc:491] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 02:14:56.563248 2608 raft_consensus.cc:3058] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:14:56.567363 2608 raft_consensus.cc:513] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } }
I20250624 02:14:56.568194 2608 leader_election.cc:304] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 5a801f07eaeb46368d7fc28f51b9e1d6; no voters:
I20250624 02:14:56.570025 2608 leader_election.cc:290] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [CANDIDATE]: Term 1 election: Requested vote from peers
I20250624 02:14:56.570559 2610 raft_consensus.cc:2802] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 1 FOLLOWER]: Leader election won for term 1
I20250624 02:14:56.573205 2567 heartbeater.cc:499] Master 127.31.42.254:39985 was elected leader, sending a full tablet report...
I20250624 02:14:56.573457 2610 raft_consensus.cc:695] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 1 LEADER]: Becoming Leader. State: Replica: 5a801f07eaeb46368d7fc28f51b9e1d6, State: Running, Role: LEADER
I20250624 02:14:56.574337 2608 ts_tablet_manager.cc:1428] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6: Time spent starting tablet: real 0.031s user 0.027s sys 0.004s
I20250624 02:14:56.574431 2610 consensus_queue.cc:237] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } }
I20250624 02:14:56.585489 2111 catalog_manager.cc:5582] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 reported cstate change: term changed from 0 to 1, leader changed from <none> to 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195). New cstate: current_term: 1 leader_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } health_report { overall_health: HEALTHY } } }
I20250624 02:14:56.784899 31915 test_util.cc:276] Using random seed: -452279836
I20250624 02:14:56.806419 2109 catalog_manager.cc:2232] Servicing CreateTable request from {username='slave'} at 127.0.0.1:49688:
name: "TestTable2"
schema {
columns {
name: "key"
type: INT32
is_key: true
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "int_val"
type: INT32
is_key: false
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "string_val"
type: STRING
is_key: false
is_nullable: true
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
}
num_replicas: 1
split_rows_range_bounds {
}
partition_schema {
range_schema {
columns {
name: "key"
}
}
}
I20250624 02:14:56.834594 2368 tablet_service.cc:1468] Processing CreateTablet for tablet dde7659b8edb4c1ca042d33a2906f7f1 (DEFAULT_TABLE table=TestTable2 [id=84a311cc03c74208a3ed78497bc3a41d]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 02:14:56.835989 2368 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet dde7659b8edb4c1ca042d33a2906f7f1. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:14:56.854770 2629 tablet_bootstrap.cc:492] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607: Bootstrap starting.
I20250624 02:14:56.860683 2629 tablet_bootstrap.cc:654] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607: Neither blocks nor log segments found. Creating new log.
I20250624 02:14:56.862382 2629 log.cc:826] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607: Log is configured to *not* fsync() on all Append() calls
I20250624 02:14:56.866806 2629 tablet_bootstrap.cc:492] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607: No bootstrap required, opened a new log
I20250624 02:14:56.867214 2629 ts_tablet_manager.cc:1397] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607: Time spent bootstrapping tablet: real 0.013s user 0.008s sys 0.003s
I20250624 02:14:56.885041 2629 raft_consensus.cc:357] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } }
I20250624 02:14:56.885587 2629 raft_consensus.cc:383] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:14:56.885795 2629 raft_consensus.cc:738] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 6431b2b3e46e4e199cfb1609d7c42607, State: Initialized, Role: FOLLOWER
I20250624 02:14:56.886461 2629 consensus_queue.cc:260] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } }
I20250624 02:14:56.886958 2629 raft_consensus.cc:397] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 02:14:56.887207 2629 raft_consensus.cc:491] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 02:14:56.887468 2629 raft_consensus.cc:3058] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:14:56.891649 2629 raft_consensus.cc:513] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } }
I20250624 02:14:56.892308 2629 leader_election.cc:304] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 6431b2b3e46e4e199cfb1609d7c42607; no voters:
I20250624 02:14:56.894352 2629 leader_election.cc:290] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [CANDIDATE]: Term 1 election: Requested vote from peers
I20250624 02:14:56.894721 2631 raft_consensus.cc:2802] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [term 1 FOLLOWER]: Leader election won for term 1
I20250624 02:14:56.897058 2631 raft_consensus.cc:695] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [term 1 LEADER]: Becoming Leader. State: Replica: 6431b2b3e46e4e199cfb1609d7c42607, State: Running, Role: LEADER
I20250624 02:14:56.897866 2631 consensus_queue.cc:237] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } }
I20250624 02:14:56.899334 2629 ts_tablet_manager.cc:1428] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607: Time spent starting tablet: real 0.032s user 0.020s sys 0.012s
I20250624 02:14:56.909364 2109 catalog_manager.cc:5582] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 reported cstate change: term changed from 0 to 1, leader changed from <none> to 6431b2b3e46e4e199cfb1609d7c42607 (127.31.42.194). New cstate: current_term: 1 leader_uuid: "6431b2b3e46e4e199cfb1609d7c42607" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } health_report { overall_health: HEALTHY } } }
I20250624 02:14:57.139429 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 2078
W20250624 02:14:57.272735 2299 heartbeater.cc:646] Failed to heartbeat to 127.31.42.254:39985 (0 consecutive failures): Network error: Failed to send heartbeat to master: Client connection negotiation failed: client connection to 127.31.42.254:39985: connect: Connection refused (error 111)
W20250624 02:14:57.602694 2567 heartbeater.cc:646] Failed to heartbeat to 127.31.42.254:39985 (0 consecutive failures): Network error: Failed to send heartbeat to master: Client connection negotiation failed: client connection to 127.31.42.254:39985: connect: Connection refused (error 111)
W20250624 02:14:57.925920 2434 heartbeater.cc:646] Failed to heartbeat to 127.31.42.254:39985 (0 consecutive failures): Network error: Failed to send heartbeat to master: Client connection negotiation failed: client connection to 127.31.42.254:39985: connect: Connection refused (error 111)
I20250624 02:15:02.041354 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 2170
I20250624 02:15:02.075409 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 2303
I20250624 02:15:02.105604 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 2438
I20250624 02:15:02.135257 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.31.42.254:39985
--webserver_interface=127.31.42.254
--webserver_port=36667
--builtin_ntp_servers=127.31.42.212:37039
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.31.42.254:39985 with env {}
W20250624 02:15:02.442737 2708 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:15:02.443348 2708 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:15:02.443836 2708 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:15:02.475785 2708 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20250624 02:15:02.476125 2708 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:15:02.476380 2708 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20250624 02:15:02.476629 2708 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20250624 02:15:02.512331 2708 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:37039
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.31.42.254:39985
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.31.42.254:39985
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb
--webserver_interface=127.31.42.254
--webserver_port=36667
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/logs
--logbuflevel=-1
--logtostderr=true
Master server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:15:02.513691 2708 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:15:02.515378 2708 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:15:02.531661 2714 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:15:02.531720 2715 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:15:02.531723 2717 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:15:03.754890 2716 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Resolving timed out after 1221 milliseconds
I20250624 02:15:03.755024 2708 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
I20250624 02:15:03.756402 2708 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:15:03.759060 2708 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:15:03.760459 2708 hybrid_clock.cc:648] HybridClock initialized: now 1750731303760428 us; error 51 us; skew 500 ppm
I20250624 02:15:03.761348 2708 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:15:03.775652 2708 webserver.cc:469] Webserver started at http://127.31.42.254:36667/ using document root <none> and password file <none>
I20250624 02:15:03.776714 2708 fs_manager.cc:362] Metadata directory not provided
I20250624 02:15:03.776932 2708 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:15:03.785279 2708 fs_manager.cc:714] Time spent opening directory manager: real 0.005s user 0.006s sys 0.000s
I20250624 02:15:03.790223 2725 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:15:03.791435 2708 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.004s sys 0.000s
I20250624 02:15:03.791826 2708 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
uuid: "647b3bf07b6d4b56a97941e98dcfc165"
format_stamp: "Formatted at 2025-06-24 02:14:49 on dist-test-slave-5k9r"
I20250624 02:15:03.793884 2708 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:15:03.845158 2708 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:15:03.846763 2708 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:15:03.847239 2708 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:15:03.920542 2708 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.254:39985
I20250624 02:15:03.920648 2776 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.254:39985 every 8 connection(s)
I20250624 02:15:03.923475 2708 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb
I20250624 02:15:03.926584 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 2708
I20250624 02:15:03.928961 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.193:39677
--local_ip_for_outbound_sockets=127.31.42.193
--tserver_master_addrs=127.31.42.254:39985
--webserver_port=38599
--webserver_interface=127.31.42.193
--builtin_ntp_servers=127.31.42.212:37039
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--consensus_rpc_timeout_ms=30000 with env {}
I20250624 02:15:03.939538 2777 sys_catalog.cc:263] Verifying existing consensus state
I20250624 02:15:03.944648 2777 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165: Bootstrap starting.
I20250624 02:15:03.955225 2777 log.cc:826] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165: Log is configured to *not* fsync() on all Append() calls
I20250624 02:15:04.004781 2777 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165: Bootstrap replayed 1/1 log segments. Stats: ops{read=18 overwritten=0 applied=18 ignored=0} inserts{seen=13 ignored=0} mutations{seen=10 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 02:15:04.005625 2777 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165: Bootstrap complete.
I20250624 02:15:04.027494 2777 raft_consensus.cc:357] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 2 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "647b3bf07b6d4b56a97941e98dcfc165" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 39985 } }
I20250624 02:15:04.029846 2777 raft_consensus.cc:738] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 2 FOLLOWER]: Becoming Follower/Learner. State: Replica: 647b3bf07b6d4b56a97941e98dcfc165, State: Initialized, Role: FOLLOWER
I20250624 02:15:04.030773 2777 consensus_queue.cc:260] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 18, Last appended: 2.18, Last appended by leader: 18, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "647b3bf07b6d4b56a97941e98dcfc165" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 39985 } }
I20250624 02:15:04.031351 2777 raft_consensus.cc:397] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 2 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 02:15:04.031697 2777 raft_consensus.cc:491] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 2 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 02:15:04.032146 2777 raft_consensus.cc:3058] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 2 FOLLOWER]: Advancing to term 3
I20250624 02:15:04.037968 2777 raft_consensus.cc:513] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 3 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "647b3bf07b6d4b56a97941e98dcfc165" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 39985 } }
I20250624 02:15:04.038657 2777 leader_election.cc:304] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [CANDIDATE]: Term 3 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 647b3bf07b6d4b56a97941e98dcfc165; no voters:
I20250624 02:15:04.040863 2777 leader_election.cc:290] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [CANDIDATE]: Term 3 election: Requested vote from peers
I20250624 02:15:04.041380 2781 raft_consensus.cc:2802] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 3 FOLLOWER]: Leader election won for term 3
I20250624 02:15:04.045032 2781 raft_consensus.cc:695] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 3 LEADER]: Becoming Leader. State: Replica: 647b3bf07b6d4b56a97941e98dcfc165, State: Running, Role: LEADER
I20250624 02:15:04.046006 2781 consensus_queue.cc:237] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 18, Committed index: 18, Last appended: 2.18, Last appended by leader: 18, Current term: 3, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "647b3bf07b6d4b56a97941e98dcfc165" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 39985 } }
I20250624 02:15:04.046459 2777 sys_catalog.cc:564] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [sys.catalog]: configured and running, proceeding with master startup.
I20250624 02:15:04.057508 2783 sys_catalog.cc:455] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 647b3bf07b6d4b56a97941e98dcfc165. Latest consensus state: current_term: 3 leader_uuid: "647b3bf07b6d4b56a97941e98dcfc165" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "647b3bf07b6d4b56a97941e98dcfc165" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 39985 } } }
I20250624 02:15:04.058743 2783 sys_catalog.cc:458] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [sys.catalog]: This master's current role is: LEADER
I20250624 02:15:04.063292 2782 sys_catalog.cc:455] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 3 leader_uuid: "647b3bf07b6d4b56a97941e98dcfc165" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "647b3bf07b6d4b56a97941e98dcfc165" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 39985 } } }
I20250624 02:15:04.063928 2782 sys_catalog.cc:458] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [sys.catalog]: This master's current role is: LEADER
I20250624 02:15:04.065927 2787 catalog_manager.cc:1477] Loading table and tablet metadata into memory...
I20250624 02:15:04.080364 2787 catalog_manager.cc:671] Loaded metadata for table TestTable [id=29250cbf4f704880818d3bca5c60b6ea]
I20250624 02:15:04.082655 2787 catalog_manager.cc:671] Loaded metadata for table TestTable2 [id=84a311cc03c74208a3ed78497bc3a41d]
I20250624 02:15:04.084461 2787 catalog_manager.cc:671] Loaded metadata for table TestTable1 [id=d6ec419af58945df89f9937c2a8ef823]
I20250624 02:15:04.093062 2787 tablet_loader.cc:96] loaded metadata for tablet 45341658889647099a604223ab78b6db (table TestTable [id=29250cbf4f704880818d3bca5c60b6ea])
I20250624 02:15:04.094551 2787 tablet_loader.cc:96] loaded metadata for tablet 529e58fc77b545c388e07e325ebb525a (table TestTable1 [id=d6ec419af58945df89f9937c2a8ef823])
I20250624 02:15:04.096263 2787 tablet_loader.cc:96] loaded metadata for tablet dde7659b8edb4c1ca042d33a2906f7f1 (table TestTable2 [id=84a311cc03c74208a3ed78497bc3a41d])
I20250624 02:15:04.097832 2787 catalog_manager.cc:1486] Initializing Kudu cluster ID...
I20250624 02:15:04.103724 2787 catalog_manager.cc:1261] Loaded cluster ID: 5528683bbe9a41fc8b42fa74b6850b6d
I20250624 02:15:04.104128 2787 catalog_manager.cc:1497] Initializing Kudu internal certificate authority...
I20250624 02:15:04.113540 2787 catalog_manager.cc:1506] Loading token signing keys...
I20250624 02:15:04.119586 2787 catalog_manager.cc:5966] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165: Loaded TSK: 0
I20250624 02:15:04.121598 2787 catalog_manager.cc:1516] Initializing in-progress tserver states...
W20250624 02:15:04.278297 2779 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:15:04.278828 2779 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:15:04.279328 2779 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:15:04.311928 2779 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:15:04.312886 2779 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.193
I20250624 02:15:04.349093 2779 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:37039
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.193:39677
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/info.pb
--webserver_interface=127.31.42.193
--webserver_port=38599
--tserver_master_addrs=127.31.42.254:39985
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.193
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:15:04.350438 2779 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:15:04.352090 2779 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:15:04.369982 2805 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:15:04.373095 2807 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:15:04.373659 2779 server_base.cc:1048] running on GCE node
W20250624 02:15:04.372480 2804 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:15:05.571753 2779 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:15:05.574728 2779 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:15:05.576232 2779 hybrid_clock.cc:648] HybridClock initialized: now 1750731305576107 us; error 143 us; skew 500 ppm
I20250624 02:15:05.577078 2779 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:15:05.584156 2779 webserver.cc:469] Webserver started at http://127.31.42.193:38599/ using document root <none> and password file <none>
I20250624 02:15:05.585112 2779 fs_manager.cc:362] Metadata directory not provided
I20250624 02:15:05.585356 2779 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:15:05.594087 2779 fs_manager.cc:714] Time spent opening directory manager: real 0.006s user 0.007s sys 0.001s
I20250624 02:15:05.599417 2814 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:15:05.600582 2779 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.003s sys 0.002s
I20250624 02:15:05.600917 2779 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
uuid: "a419c27a5f5b4eaa819ede2e09199dc0"
format_stamp: "Formatted at 2025-06-24 02:14:52 on dist-test-slave-5k9r"
I20250624 02:15:05.602946 2779 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:15:05.653823 2779 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:15:05.655366 2779 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:15:05.655819 2779 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:15:05.658952 2779 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:15:05.665450 2821 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20250624 02:15:05.673285 2779 ts_tablet_manager.cc:579] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20250624 02:15:05.673516 2779 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.010s user 0.002s sys 0.001s
I20250624 02:15:05.673851 2779 ts_tablet_manager.cc:594] Registering tablets (0/1 complete)
I20250624 02:15:05.678717 2779 ts_tablet_manager.cc:610] Registered 1 tablets
I20250624 02:15:05.678979 2779 ts_tablet_manager.cc:589] Time spent register tablets: real 0.005s user 0.002s sys 0.003s
I20250624 02:15:05.679405 2821 tablet_bootstrap.cc:492] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0: Bootstrap starting.
I20250624 02:15:05.764065 2821 log.cc:826] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0: Log is configured to *not* fsync() on all Append() calls
I20250624 02:15:05.876070 2779 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.193:39677
I20250624 02:15:05.876256 2928 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.193:39677 every 8 connection(s)
I20250624 02:15:05.879702 2779 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/info.pb
I20250624 02:15:05.887208 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 2779
I20250624 02:15:05.889093 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.194:35121
--local_ip_for_outbound_sockets=127.31.42.194
--tserver_master_addrs=127.31.42.254:39985
--webserver_port=39475
--webserver_interface=127.31.42.194
--builtin_ntp_servers=127.31.42.212:37039
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--consensus_rpc_timeout_ms=30000 with env {}
I20250624 02:15:05.900110 2821 tablet_bootstrap.cc:492] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0: Bootstrap replayed 1/1 log segments. Stats: ops{read=6 overwritten=0 applied=6 ignored=0} inserts{seen=250 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 02:15:05.900899 2821 tablet_bootstrap.cc:492] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0: Bootstrap complete.
I20250624 02:15:05.902302 2821 ts_tablet_manager.cc:1397] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0: Time spent bootstrapping tablet: real 0.223s user 0.185s sys 0.031s
I20250624 02:15:05.921403 2821 raft_consensus.cc:357] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } }
I20250624 02:15:05.924587 2821 raft_consensus.cc:738] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: a419c27a5f5b4eaa819ede2e09199dc0, State: Initialized, Role: FOLLOWER
I20250624 02:15:05.925715 2821 consensus_queue.cc:260] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 6, Last appended: 1.6, Last appended by leader: 6, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } }
I20250624 02:15:05.926467 2821 raft_consensus.cc:397] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 1 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 02:15:05.926930 2821 raft_consensus.cc:491] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 1 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 02:15:05.927357 2821 raft_consensus.cc:3058] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 1 FOLLOWER]: Advancing to term 2
I20250624 02:15:05.935401 2821 raft_consensus.cc:513] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } }
I20250624 02:15:05.936635 2929 heartbeater.cc:344] Connected to a master server at 127.31.42.254:39985
I20250624 02:15:05.936831 2821 leader_election.cc:304] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: a419c27a5f5b4eaa819ede2e09199dc0; no voters:
I20250624 02:15:05.937191 2929 heartbeater.cc:461] Registering TS with master...
I20250624 02:15:05.938535 2929 heartbeater.cc:507] Master 127.31.42.254:39985 requested a full tablet report, sending...
I20250624 02:15:05.943637 2742 ts_manager.cc:194] Registered new tserver with Master: a419c27a5f5b4eaa819ede2e09199dc0 (127.31.42.193:39677)
I20250624 02:15:05.952437 2742 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.193:49397
I20250624 02:15:05.954524 2821 leader_election.cc:290] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [CANDIDATE]: Term 2 election: Requested vote from peers
I20250624 02:15:05.954977 2934 raft_consensus.cc:2802] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 2 FOLLOWER]: Leader election won for term 2
I20250624 02:15:05.964138 2934 raft_consensus.cc:695] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 2 LEADER]: Becoming Leader. State: Replica: a419c27a5f5b4eaa819ede2e09199dc0, State: Running, Role: LEADER
I20250624 02:15:05.964337 2929 heartbeater.cc:499] Master 127.31.42.254:39985 was elected leader, sending a full tablet report...
I20250624 02:15:05.965873 2934 consensus_queue.cc:237] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 6, Committed index: 6, Last appended: 1.6, Last appended by leader: 6, Current term: 2, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } }
I20250624 02:15:05.975157 2821 ts_tablet_manager.cc:1428] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0: Time spent starting tablet: real 0.073s user 0.042s sys 0.025s
I20250624 02:15:05.986203 2742 catalog_manager.cc:5582] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 reported cstate change: term changed from 0 to 2, leader changed from <none> to a419c27a5f5b4eaa819ede2e09199dc0 (127.31.42.193), VOTER a419c27a5f5b4eaa819ede2e09199dc0 (127.31.42.193) added. New cstate: current_term: 2 leader_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } health_report { overall_health: HEALTHY } } }
W20250624 02:15:06.039700 2742 catalog_manager.cc:5260] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 45341658889647099a604223ab78b6db with cas_config_opid_index -1: no extra replica candidate found for tablet 45341658889647099a604223ab78b6db (table TestTable [id=29250cbf4f704880818d3bca5c60b6ea]): Not found: could not select location for extra replica: not enough tablet servers to satisfy replica placement policy: the total number of registered tablet servers (1) does not allow for adding an extra replica; consider bringing up more to have at least 4 tablet servers up and running
W20250624 02:15:06.282194 2933 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:15:06.282686 2933 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:15:06.283162 2933 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:15:06.314904 2933 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:15:06.315725 2933 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.194
I20250624 02:15:06.351742 2933 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:37039
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.194:35121
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/info.pb
--webserver_interface=127.31.42.194
--webserver_port=39475
--tserver_master_addrs=127.31.42.254:39985
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.194
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:15:06.352962 2933 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:15:06.354665 2933 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:15:06.372308 2949 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:15:06.372344 2948 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:15:06.372754 2951 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:15:06.374994 2933 server_base.cc:1048] running on GCE node
I20250624 02:15:07.556564 2933 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:15:07.559387 2933 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:15:07.560871 2933 hybrid_clock.cc:648] HybridClock initialized: now 1750731307560847 us; error 49 us; skew 500 ppm
I20250624 02:15:07.561687 2933 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:15:07.568573 2933 webserver.cc:469] Webserver started at http://127.31.42.194:39475/ using document root <none> and password file <none>
I20250624 02:15:07.569504 2933 fs_manager.cc:362] Metadata directory not provided
I20250624 02:15:07.569717 2933 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:15:07.577822 2933 fs_manager.cc:714] Time spent opening directory manager: real 0.005s user 0.007s sys 0.000s
I20250624 02:15:07.582567 2958 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:15:07.583647 2933 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.002s sys 0.002s
I20250624 02:15:07.583973 2933 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
uuid: "6431b2b3e46e4e199cfb1609d7c42607"
format_stamp: "Formatted at 2025-06-24 02:14:53 on dist-test-slave-5k9r"
I20250624 02:15:07.585881 2933 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:15:07.638139 2933 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:15:07.639612 2933 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:15:07.640066 2933 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:15:07.642709 2933 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:15:07.648499 2965 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20250624 02:15:07.656143 2933 ts_tablet_manager.cc:579] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20250624 02:15:07.656391 2933 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.009s user 0.001s sys 0.001s
I20250624 02:15:07.656728 2933 ts_tablet_manager.cc:594] Registering tablets (0/1 complete)
I20250624 02:15:07.661749 2933 ts_tablet_manager.cc:610] Registered 1 tablets
I20250624 02:15:07.662021 2933 ts_tablet_manager.cc:589] Time spent register tablets: real 0.005s user 0.002s sys 0.003s
I20250624 02:15:07.662391 2965 tablet_bootstrap.cc:492] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607: Bootstrap starting.
I20250624 02:15:07.718490 2965 log.cc:826] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607: Log is configured to *not* fsync() on all Append() calls
I20250624 02:15:07.797628 2965 tablet_bootstrap.cc:492] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607: Bootstrap replayed 1/1 log segments. Stats: ops{read=6 overwritten=0 applied=6 ignored=0} inserts{seen=250 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 02:15:07.798460 2965 tablet_bootstrap.cc:492] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607: Bootstrap complete.
I20250624 02:15:07.800148 2965 ts_tablet_manager.cc:1397] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607: Time spent bootstrapping tablet: real 0.138s user 0.089s sys 0.045s
I20250624 02:15:07.817226 2965 raft_consensus.cc:357] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } }
I20250624 02:15:07.819350 2965 raft_consensus.cc:738] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 6431b2b3e46e4e199cfb1609d7c42607, State: Initialized, Role: FOLLOWER
I20250624 02:15:07.820212 2965 consensus_queue.cc:260] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 6, Last appended: 1.6, Last appended by leader: 6, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } }
I20250624 02:15:07.821004 2965 raft_consensus.cc:397] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [term 1 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 02:15:07.821398 2965 raft_consensus.cc:491] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [term 1 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 02:15:07.821764 2965 raft_consensus.cc:3058] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [term 1 FOLLOWER]: Advancing to term 2
I20250624 02:15:07.828969 2965 raft_consensus.cc:513] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } }
I20250624 02:15:07.829751 2965 leader_election.cc:304] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 6431b2b3e46e4e199cfb1609d7c42607; no voters:
I20250624 02:15:07.832613 2965 leader_election.cc:290] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [CANDIDATE]: Term 2 election: Requested vote from peers
I20250624 02:15:07.833981 3061 raft_consensus.cc:2802] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 FOLLOWER]: Leader election won for term 2
I20250624 02:15:07.837891 3061 raft_consensus.cc:695] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 LEADER]: Becoming Leader. State: Replica: 6431b2b3e46e4e199cfb1609d7c42607, State: Running, Role: LEADER
I20250624 02:15:07.839200 3061 consensus_queue.cc:237] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 6, Committed index: 6, Last appended: 1.6, Last appended by leader: 6, Current term: 2, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } }
I20250624 02:15:07.841614 2965 ts_tablet_manager.cc:1428] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607: Time spent starting tablet: real 0.041s user 0.033s sys 0.008s
I20250624 02:15:07.856735 2933 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.194:35121
I20250624 02:15:07.857272 3077 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.194:35121 every 8 connection(s)
I20250624 02:15:07.859241 2933 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/info.pb
I20250624 02:15:07.862780 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 2933
I20250624 02:15:07.864773 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.195:42985
--local_ip_for_outbound_sockets=127.31.42.195
--tserver_master_addrs=127.31.42.254:39985
--webserver_port=39881
--webserver_interface=127.31.42.195
--builtin_ntp_servers=127.31.42.212:37039
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--consensus_rpc_timeout_ms=30000 with env {}
I20250624 02:15:07.877609 3078 heartbeater.cc:344] Connected to a master server at 127.31.42.254:39985
I20250624 02:15:07.878234 3078 heartbeater.cc:461] Registering TS with master...
I20250624 02:15:07.879628 3078 heartbeater.cc:507] Master 127.31.42.254:39985 requested a full tablet report, sending...
I20250624 02:15:07.884249 2742 ts_manager.cc:194] Registered new tserver with Master: 6431b2b3e46e4e199cfb1609d7c42607 (127.31.42.194:35121)
I20250624 02:15:07.885653 2742 catalog_manager.cc:5582] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 reported cstate change: term changed from 1 to 2. New cstate: current_term: 2 leader_uuid: "6431b2b3e46e4e199cfb1609d7c42607" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } health_report { overall_health: HEALTHY } } }
I20250624 02:15:07.898438 2742 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.194:38899
I20250624 02:15:07.902436 3078 heartbeater.cc:499] Master 127.31.42.254:39985 was elected leader, sending a full tablet report...
W20250624 02:15:08.188608 3082 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:15:08.189095 3082 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:15:08.189538 3082 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:15:08.221199 3082 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:15:08.222050 3082 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.195
I20250624 02:15:08.257265 3082 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:37039
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.195:42985
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/info.pb
--webserver_interface=127.31.42.195
--webserver_port=39881
--tserver_master_addrs=127.31.42.254:39985
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.195
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:15:08.258491 3082 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:15:08.260104 3082 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:15:08.275631 3093 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:15:08.355101 2884 consensus_queue.cc:237] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 7, Committed index: 7, Last appended: 2.7, Last appended by leader: 6, Current term: 2, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: 8 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: NON_VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: true } }
I20250624 02:15:08.362558 3103 raft_consensus.cc:2953] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 2 LEADER]: Committing config change with OpId 2.8: config changed from index -1 to 8, NON_VOTER 6431b2b3e46e4e199cfb1609d7c42607 (127.31.42.194) added. New config: { opid_index: 8 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: NON_VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: true } } }
I20250624 02:15:08.376945 2727 catalog_manager.cc:5095] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 45341658889647099a604223ab78b6db with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 8)
I20250624 02:15:08.384845 2742 catalog_manager.cc:5582] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 reported cstate change: config changed from index -1 to 8, NON_VOTER 6431b2b3e46e4e199cfb1609d7c42607 (127.31.42.194) added. New cstate: current_term: 2 leader_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" committed_config { opid_index: 8 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: NON_VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
W20250624 02:15:08.397336 2817 consensus_peers.cc:487] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 -> Peer 6431b2b3e46e4e199cfb1609d7c42607 (127.31.42.194:35121): Couldn't send request to peer 6431b2b3e46e4e199cfb1609d7c42607. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 45341658889647099a604223ab78b6db. This is attempt 1: this message will repeat every 5th retry.
I20250624 02:15:08.920617 3107 ts_tablet_manager.cc:927] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607: Initiating tablet copy from peer a419c27a5f5b4eaa819ede2e09199dc0 (127.31.42.193:39677)
I20250624 02:15:08.923497 3107 tablet_copy_client.cc:323] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607: tablet copy: Beginning tablet copy session from remote peer at address 127.31.42.193:39677
I20250624 02:15:08.951658 2904 tablet_copy_service.cc:140] P a419c27a5f5b4eaa819ede2e09199dc0: Received BeginTabletCopySession request for tablet 45341658889647099a604223ab78b6db from peer 6431b2b3e46e4e199cfb1609d7c42607 ({username='slave'} at 127.31.42.194:37515)
I20250624 02:15:08.952598 2904 tablet_copy_service.cc:161] P a419c27a5f5b4eaa819ede2e09199dc0: Beginning new tablet copy session on tablet 45341658889647099a604223ab78b6db from peer 6431b2b3e46e4e199cfb1609d7c42607 at {username='slave'} at 127.31.42.194:37515: session id = 6431b2b3e46e4e199cfb1609d7c42607-45341658889647099a604223ab78b6db
I20250624 02:15:08.963793 2904 tablet_copy_source_session.cc:215] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0: Tablet Copy: opened 0 blocks and 1 log segments
I20250624 02:15:08.969116 3107 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 45341658889647099a604223ab78b6db. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:15:08.979712 3107 tablet_copy_client.cc:806] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607: tablet copy: Starting download of 0 data blocks...
I20250624 02:15:08.980202 3107 tablet_copy_client.cc:670] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607: tablet copy: Starting download of 1 WAL segments...
I20250624 02:15:08.986538 3107 tablet_copy_client.cc:538] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20250624 02:15:08.992821 3107 tablet_bootstrap.cc:492] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607: Bootstrap starting.
I20250624 02:15:09.065176 3107 tablet_bootstrap.cc:492] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607: Bootstrap replayed 1/1 log segments. Stats: ops{read=8 overwritten=0 applied=8 ignored=0} inserts{seen=250 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 02:15:09.065812 3107 tablet_bootstrap.cc:492] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607: Bootstrap complete.
I20250624 02:15:09.066354 3107 ts_tablet_manager.cc:1397] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607: Time spent bootstrapping tablet: real 0.074s user 0.060s sys 0.016s
I20250624 02:15:09.068006 3107 raft_consensus.cc:357] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 LEARNER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: 8 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: NON_VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: true } }
I20250624 02:15:09.068399 3107 raft_consensus.cc:738] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 LEARNER]: Becoming Follower/Learner. State: Replica: 6431b2b3e46e4e199cfb1609d7c42607, State: Initialized, Role: LEARNER
I20250624 02:15:09.068825 3107 consensus_queue.cc:260] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 8, Last appended: 2.8, Last appended by leader: 8, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: 8 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: NON_VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: true } }
I20250624 02:15:09.071494 3107 ts_tablet_manager.cc:1428] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607: Time spent starting tablet: real 0.005s user 0.000s sys 0.004s
I20250624 02:15:09.073765 2904 tablet_copy_service.cc:342] P a419c27a5f5b4eaa819ede2e09199dc0: Request end of tablet copy session 6431b2b3e46e4e199cfb1609d7c42607-45341658889647099a604223ab78b6db received from {username='slave'} at 127.31.42.194:37515
I20250624 02:15:09.074170 2904 tablet_copy_service.cc:434] P a419c27a5f5b4eaa819ede2e09199dc0: ending tablet copy session 6431b2b3e46e4e199cfb1609d7c42607-45341658889647099a604223ab78b6db on tablet 45341658889647099a604223ab78b6db with peer 6431b2b3e46e4e199cfb1609d7c42607
I20250624 02:15:09.345888 3028 raft_consensus.cc:1215] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 LEARNER]: Deduplicated request from leader. Original: 2.7->[2.8-2.8] Dedup: 2.8->[]
W20250624 02:15:08.276350 3094 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:15:08.276628 3096 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:15:08.277595 3082 server_base.cc:1048] running on GCE node
I20250624 02:15:09.482947 3082 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:15:09.485191 3082 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:15:09.486560 3082 hybrid_clock.cc:648] HybridClock initialized: now 1750731309486518 us; error 53 us; skew 500 ppm
I20250624 02:15:09.487360 3082 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:15:09.498099 3082 webserver.cc:469] Webserver started at http://127.31.42.195:39881/ using document root <none> and password file <none>
I20250624 02:15:09.499063 3082 fs_manager.cc:362] Metadata directory not provided
I20250624 02:15:09.499306 3082 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:15:09.507228 3082 fs_manager.cc:714] Time spent opening directory manager: real 0.005s user 0.003s sys 0.001s
I20250624 02:15:09.511821 3119 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:15:09.512858 3082 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.004s sys 0.000s
I20250624 02:15:09.513170 3082 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
uuid: "5a801f07eaeb46368d7fc28f51b9e1d6"
format_stamp: "Formatted at 2025-06-24 02:14:55 on dist-test-slave-5k9r"
I20250624 02:15:09.515139 3082 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:15:09.576180 3082 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:15:09.577646 3082 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:15:09.578125 3082 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:15:09.580857 3082 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:15:09.586671 3126 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20250624 02:15:09.594440 3082 ts_tablet_manager.cc:579] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20250624 02:15:09.594661 3082 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.009s user 0.002s sys 0.000s
I20250624 02:15:09.594918 3082 ts_tablet_manager.cc:594] Registering tablets (0/1 complete)
I20250624 02:15:09.599586 3082 ts_tablet_manager.cc:610] Registered 1 tablets
I20250624 02:15:09.599776 3082 ts_tablet_manager.cc:589] Time spent register tablets: real 0.005s user 0.004s sys 0.000s
I20250624 02:15:09.600162 3126 tablet_bootstrap.cc:492] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6: Bootstrap starting.
I20250624 02:15:09.655931 3126 log.cc:826] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6: Log is configured to *not* fsync() on all Append() calls
I20250624 02:15:09.743599 3126 tablet_bootstrap.cc:492] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6: Bootstrap replayed 1/1 log segments. Stats: ops{read=7 overwritten=0 applied=7 ignored=0} inserts{seen=300 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 02:15:09.744473 3126 tablet_bootstrap.cc:492] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6: Bootstrap complete.
I20250624 02:15:09.746253 3126 ts_tablet_manager.cc:1397] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6: Time spent bootstrapping tablet: real 0.146s user 0.097s sys 0.048s
I20250624 02:15:09.762302 3126 raft_consensus.cc:357] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } }
I20250624 02:15:09.764935 3126 raft_consensus.cc:738] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 5a801f07eaeb46368d7fc28f51b9e1d6, State: Initialized, Role: FOLLOWER
I20250624 02:15:09.765753 3126 consensus_queue.cc:260] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 7, Last appended: 1.7, Last appended by leader: 7, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } }
I20250624 02:15:09.766475 3126 raft_consensus.cc:397] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 1 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 02:15:09.766808 3126 raft_consensus.cc:491] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 1 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 02:15:09.767086 3126 raft_consensus.cc:3058] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 1 FOLLOWER]: Advancing to term 2
I20250624 02:15:09.774336 3126 raft_consensus.cc:513] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } }
I20250624 02:15:09.775192 3126 leader_election.cc:304] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 5a801f07eaeb46368d7fc28f51b9e1d6; no voters:
I20250624 02:15:09.778187 3126 leader_election.cc:290] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [CANDIDATE]: Term 2 election: Requested vote from peers
I20250624 02:15:09.778589 3228 raft_consensus.cc:2802] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 2 FOLLOWER]: Leader election won for term 2
I20250624 02:15:09.783633 3126 ts_tablet_manager.cc:1428] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6: Time spent starting tablet: real 0.037s user 0.027s sys 0.008s
I20250624 02:15:09.784938 3228 raft_consensus.cc:695] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 2 LEADER]: Becoming Leader. State: Replica: 5a801f07eaeb46368d7fc28f51b9e1d6, State: Running, Role: LEADER
I20250624 02:15:09.786077 3228 consensus_queue.cc:237] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 7, Committed index: 7, Last appended: 1.7, Last appended by leader: 7, Current term: 2, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } }
I20250624 02:15:09.788818 3082 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.195:42985
I20250624 02:15:09.789650 3236 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.195:42985 every 8 connection(s)
I20250624 02:15:09.792163 3082 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/info.pb
I20250624 02:15:09.793666 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 3082
I20250624 02:15:09.820266 3239 heartbeater.cc:344] Connected to a master server at 127.31.42.254:39985
I20250624 02:15:09.820719 3239 heartbeater.cc:461] Registering TS with master...
I20250624 02:15:09.821822 3239 heartbeater.cc:507] Master 127.31.42.254:39985 requested a full tablet report, sending...
I20250624 02:15:09.825090 2742 ts_manager.cc:194] Registered new tserver with Master: 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195:42985)
I20250624 02:15:09.826078 2742 catalog_manager.cc:5582] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 reported cstate change: term changed from 0 to 2, leader changed from <none> to 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195), VOTER 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195) added. New cstate: current_term: 2 leader_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } health_report { overall_health: HEALTHY } } }
I20250624 02:15:09.837103 31915 external_mini_cluster.cc:934] 3 TS(s) registered with all masters
I20250624 02:15:09.839054 2742 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.195:42503
I20250624 02:15:09.843323 3239 heartbeater.cc:499] Master 127.31.42.254:39985 was elected leader, sending a full tablet report...
I20250624 02:15:09.844607 31915 external_mini_cluster.cc:934] 3 TS(s) registered with all masters
W20250624 02:15:09.848680 31915 ts_itest-base.cc:209] found only 2 out of 3 replicas of tablet 45341658889647099a604223ab78b6db: tablet_id: "45341658889647099a604223ab78b6db" DEPRECATED_stale: false partition { partition_key_start: "" partition_key_end: "" } interned_replicas { ts_info_idx: 0 role: LEADER } interned_replicas { ts_info_idx: 1 role: LEARNER }
I20250624 02:15:09.854640 3189 consensus_queue.cc:237] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 8, Committed index: 8, Last appended: 2.8, Last appended by leader: 7, Current term: 2, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: 9 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: NON_VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: true } }
I20250624 02:15:09.857833 3228 raft_consensus.cc:2953] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 2 LEADER]: Committing config change with OpId 2.9: config changed from index -1 to 9, NON_VOTER a419c27a5f5b4eaa819ede2e09199dc0 (127.31.42.193) added. New config: { opid_index: 9 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: NON_VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: true } } }
I20250624 02:15:09.867563 2727 catalog_manager.cc:5095] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 529e58fc77b545c388e07e325ebb525a with cas_config_opid_index -1: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20250624 02:15:09.869771 2742 catalog_manager.cc:5582] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 reported cstate change: config changed from index -1 to 9, NON_VOTER a419c27a5f5b4eaa819ede2e09199dc0 (127.31.42.193) added. New cstate: current_term: 2 leader_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" committed_config { opid_index: 9 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: NON_VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
W20250624 02:15:09.870832 3121 consensus_peers.cc:487] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 -> Peer a419c27a5f5b4eaa819ede2e09199dc0 (127.31.42.193:39677): Couldn't send request to peer a419c27a5f5b4eaa819ede2e09199dc0. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 529e58fc77b545c388e07e325ebb525a. This is attempt 1: this message will repeat every 5th retry.
I20250624 02:15:09.880435 3189 consensus_queue.cc:237] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 9, Committed index: 9, Last appended: 2.9, Last appended by leader: 7, Current term: 2, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: 10 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: NON_VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: true } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: NON_VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: true } }
I20250624 02:15:09.883973 3232 raft_consensus.cc:2953] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 2 LEADER]: Committing config change with OpId 2.10: config changed from index 9 to 10, NON_VOTER 6431b2b3e46e4e199cfb1609d7c42607 (127.31.42.194) added. New config: { opid_index: 10 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: NON_VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: true } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: NON_VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: true } } }
W20250624 02:15:09.886616 3121 consensus_peers.cc:487] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 -> Peer a419c27a5f5b4eaa819ede2e09199dc0 (127.31.42.193:39677): Couldn't send request to peer a419c27a5f5b4eaa819ede2e09199dc0. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 529e58fc77b545c388e07e325ebb525a. This is attempt 1: this message will repeat every 5th retry.
I20250624 02:15:09.893992 2727 catalog_manager.cc:5095] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 529e58fc77b545c388e07e325ebb525a with cas_config_opid_index 9: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20250624 02:15:09.902756 2742 catalog_manager.cc:5582] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 reported cstate change: config changed from index 9 to 10, NON_VOTER 6431b2b3e46e4e199cfb1609d7c42607 (127.31.42.194) added. New cstate: current_term: 2 leader_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" committed_config { opid_index: 10 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: NON_VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: true } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: NON_VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
W20250624 02:15:09.908012 3122 consensus_peers.cc:487] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 -> Peer 6431b2b3e46e4e199cfb1609d7c42607 (127.31.42.194:35121): Couldn't send request to peer 6431b2b3e46e4e199cfb1609d7c42607. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 529e58fc77b545c388e07e325ebb525a. This is attempt 1: this message will repeat every 5th retry.
I20250624 02:15:09.948613 3253 raft_consensus.cc:1062] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0: attempting to promote NON_VOTER 6431b2b3e46e4e199cfb1609d7c42607 to VOTER
I20250624 02:15:09.950079 3253 consensus_queue.cc:237] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 8, Committed index: 8, Last appended: 2.8, Last appended by leader: 6, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 9 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } }
I20250624 02:15:09.954700 3028 raft_consensus.cc:1273] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 LEARNER]: Refusing update from remote peer a419c27a5f5b4eaa819ede2e09199dc0: Log matching property violated. Preceding OpId in replica: term: 2 index: 8. Preceding OpId from leader: term: 2 index: 9. (index mismatch)
I20250624 02:15:09.955989 3254 consensus_queue.cc:1035] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [LEADER]: Connected to new peer: Peer: permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 9, Last known committed idx: 8, Time since last communication: 0.001s
I20250624 02:15:09.962510 3253 raft_consensus.cc:2953] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 2 LEADER]: Committing config change with OpId 2.9: config changed from index 8 to 9, 6431b2b3e46e4e199cfb1609d7c42607 (127.31.42.194) changed from NON_VOTER to VOTER. New config: { opid_index: 9 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } }
I20250624 02:15:09.964056 3028 raft_consensus.cc:2953] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 FOLLOWER]: Committing config change with OpId 2.9: config changed from index 8 to 9, 6431b2b3e46e4e199cfb1609d7c42607 (127.31.42.194) changed from NON_VOTER to VOTER. New config: { opid_index: 9 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } }
I20250624 02:15:09.974378 2742 catalog_manager.cc:5582] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 reported cstate change: config changed from index 8 to 9, 6431b2b3e46e4e199cfb1609d7c42607 (127.31.42.194) changed from NON_VOTER to VOTER. New cstate: current_term: 2 leader_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" committed_config { opid_index: 9 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } health_report { overall_health: HEALTHY } } }
I20250624 02:15:09.984239 2884 consensus_queue.cc:237] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 9, Committed index: 9, Last appended: 2.9, Last appended by leader: 6, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 10 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: NON_VOTER last_known_addr { host: "127.31.42.195" port: 42985 } attrs { promote: true } }
I20250624 02:15:09.988672 3028 raft_consensus.cc:1273] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 FOLLOWER]: Refusing update from remote peer a419c27a5f5b4eaa819ede2e09199dc0: Log matching property violated. Preceding OpId in replica: term: 2 index: 9. Preceding OpId from leader: term: 2 index: 10. (index mismatch)
I20250624 02:15:09.989753 3253 consensus_queue.cc:1035] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [LEADER]: Connected to new peer: Peer: permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 10, Last known committed idx: 9, Time since last communication: 0.000s
I20250624 02:15:09.994930 3257 raft_consensus.cc:2953] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 2 LEADER]: Committing config change with OpId 2.10: config changed from index 9 to 10, NON_VOTER 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195) added. New config: { opid_index: 10 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: NON_VOTER last_known_addr { host: "127.31.42.195" port: 42985 } attrs { promote: true } } }
I20250624 02:15:09.996471 3028 raft_consensus.cc:2953] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 FOLLOWER]: Committing config change with OpId 2.10: config changed from index 9 to 10, NON_VOTER 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195) added. New config: { opid_index: 10 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: NON_VOTER last_known_addr { host: "127.31.42.195" port: 42985 } attrs { promote: true } } }
W20250624 02:15:09.998593 2816 consensus_peers.cc:487] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 -> Peer 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195:42985): Couldn't send request to peer 5a801f07eaeb46368d7fc28f51b9e1d6. Error code: TABLET_NOT_FOUND (6). Status: Not found: Tablet not found: 45341658889647099a604223ab78b6db. This is attempt 1: this message will repeat every 5th retry.
I20250624 02:15:10.001753 2727 catalog_manager.cc:5095] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 45341658889647099a604223ab78b6db with cas_config_opid_index 9: ChangeConfig:ADD_PEER:NON_VOTER succeeded (attempt 1)
I20250624 02:15:10.004895 2742 catalog_manager.cc:5582] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 reported cstate change: config changed from index 9 to 10, NON_VOTER 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195) added. New cstate: current_term: 2 leader_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" committed_config { opid_index: 10 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: NON_VOTER last_known_addr { host: "127.31.42.195" port: 42985 } attrs { promote: true } health_report { overall_health: UNKNOWN } } }
I20250624 02:15:10.375527 3260 ts_tablet_manager.cc:927] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607: Initiating tablet copy from peer 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195:42985)
I20250624 02:15:10.376845 3260 tablet_copy_client.cc:323] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607: tablet copy: Beginning tablet copy session from remote peer at address 127.31.42.195:42985
I20250624 02:15:10.386153 3209 tablet_copy_service.cc:140] P 5a801f07eaeb46368d7fc28f51b9e1d6: Received BeginTabletCopySession request for tablet 529e58fc77b545c388e07e325ebb525a from peer 6431b2b3e46e4e199cfb1609d7c42607 ({username='slave'} at 127.31.42.194:34933)
I20250624 02:15:10.386605 3209 tablet_copy_service.cc:161] P 5a801f07eaeb46368d7fc28f51b9e1d6: Beginning new tablet copy session on tablet 529e58fc77b545c388e07e325ebb525a from peer 6431b2b3e46e4e199cfb1609d7c42607 at {username='slave'} at 127.31.42.194:34933: session id = 6431b2b3e46e4e199cfb1609d7c42607-529e58fc77b545c388e07e325ebb525a
I20250624 02:15:10.391391 3209 tablet_copy_source_session.cc:215] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6: Tablet Copy: opened 0 blocks and 1 log segments
I20250624 02:15:10.394567 3260 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 529e58fc77b545c388e07e325ebb525a. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:15:10.404652 3260 tablet_copy_client.cc:806] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607: tablet copy: Starting download of 0 data blocks...
I20250624 02:15:10.405144 3260 tablet_copy_client.cc:670] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607: tablet copy: Starting download of 1 WAL segments...
I20250624 02:15:10.408629 3260 tablet_copy_client.cc:538] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20250624 02:15:10.414433 3260 tablet_bootstrap.cc:492] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607: Bootstrap starting.
I20250624 02:15:10.443375 3264 ts_tablet_manager.cc:927] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0: Initiating tablet copy from peer 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195:42985)
I20250624 02:15:10.445488 3264 tablet_copy_client.cc:323] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0: tablet copy: Beginning tablet copy session from remote peer at address 127.31.42.195:42985
I20250624 02:15:10.446991 3209 tablet_copy_service.cc:140] P 5a801f07eaeb46368d7fc28f51b9e1d6: Received BeginTabletCopySession request for tablet 529e58fc77b545c388e07e325ebb525a from peer a419c27a5f5b4eaa819ede2e09199dc0 ({username='slave'} at 127.31.42.193:41551)
I20250624 02:15:10.447364 3209 tablet_copy_service.cc:161] P 5a801f07eaeb46368d7fc28f51b9e1d6: Beginning new tablet copy session on tablet 529e58fc77b545c388e07e325ebb525a from peer a419c27a5f5b4eaa819ede2e09199dc0 at {username='slave'} at 127.31.42.193:41551: session id = a419c27a5f5b4eaa819ede2e09199dc0-529e58fc77b545c388e07e325ebb525a
I20250624 02:15:10.451649 3209 tablet_copy_source_session.cc:215] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6: Tablet Copy: opened 0 blocks and 1 log segments
I20250624 02:15:10.454244 3264 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 529e58fc77b545c388e07e325ebb525a. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:15:10.464264 3264 tablet_copy_client.cc:806] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0: tablet copy: Starting download of 0 data blocks...
I20250624 02:15:10.464774 3264 tablet_copy_client.cc:670] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0: tablet copy: Starting download of 1 WAL segments...
I20250624 02:15:10.468282 3264 tablet_copy_client.cc:538] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20250624 02:15:10.473757 3264 tablet_bootstrap.cc:492] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0: Bootstrap starting.
I20250624 02:15:10.498435 3260 tablet_bootstrap.cc:492] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607: Bootstrap replayed 1/1 log segments. Stats: ops{read=10 overwritten=0 applied=10 ignored=0} inserts{seen=300 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 02:15:10.499044 3260 tablet_bootstrap.cc:492] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607: Bootstrap complete.
I20250624 02:15:10.499480 3260 ts_tablet_manager.cc:1397] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607: Time spent bootstrapping tablet: real 0.085s user 0.068s sys 0.016s
I20250624 02:15:10.501008 3260 raft_consensus.cc:357] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 LEARNER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: 10 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: NON_VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: true } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: NON_VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: true } }
I20250624 02:15:10.501439 3260 raft_consensus.cc:738] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 LEARNER]: Becoming Follower/Learner. State: Replica: 6431b2b3e46e4e199cfb1609d7c42607, State: Initialized, Role: LEARNER
I20250624 02:15:10.501855 3260 consensus_queue.cc:260] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 10, Last appended: 2.10, Last appended by leader: 10, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: 10 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: NON_VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: true } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: NON_VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: true } }
I20250624 02:15:10.503510 3260 ts_tablet_manager.cc:1428] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607: Time spent starting tablet: real 0.004s user 0.004s sys 0.000s
I20250624 02:15:10.505069 3209 tablet_copy_service.cc:342] P 5a801f07eaeb46368d7fc28f51b9e1d6: Request end of tablet copy session 6431b2b3e46e4e199cfb1609d7c42607-529e58fc77b545c388e07e325ebb525a received from {username='slave'} at 127.31.42.194:34933
I20250624 02:15:10.505450 3209 tablet_copy_service.cc:434] P 5a801f07eaeb46368d7fc28f51b9e1d6: ending tablet copy session 6431b2b3e46e4e199cfb1609d7c42607-529e58fc77b545c388e07e325ebb525a on tablet 529e58fc77b545c388e07e325ebb525a with peer 6431b2b3e46e4e199cfb1609d7c42607
I20250624 02:15:10.560333 3264 tablet_bootstrap.cc:492] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0: Bootstrap replayed 1/1 log segments. Stats: ops{read=10 overwritten=0 applied=10 ignored=0} inserts{seen=300 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 02:15:10.560969 3264 tablet_bootstrap.cc:492] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0: Bootstrap complete.
I20250624 02:15:10.561404 3264 ts_tablet_manager.cc:1397] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0: Time spent bootstrapping tablet: real 0.088s user 0.083s sys 0.004s
I20250624 02:15:10.563015 3264 raft_consensus.cc:357] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0 [term 2 LEARNER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: 10 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: NON_VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: true } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: NON_VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: true } }
I20250624 02:15:10.563477 3264 raft_consensus.cc:738] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0 [term 2 LEARNER]: Becoming Follower/Learner. State: Replica: a419c27a5f5b4eaa819ede2e09199dc0, State: Initialized, Role: LEARNER
I20250624 02:15:10.563910 3264 consensus_queue.cc:260] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 10, Last appended: 2.10, Last appended by leader: 10, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: 10 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: NON_VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: true } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: NON_VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: true } }
I20250624 02:15:10.565274 3264 ts_tablet_manager.cc:1428] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0: Time spent starting tablet: real 0.004s user 0.004s sys 0.000s
I20250624 02:15:10.566640 3209 tablet_copy_service.cc:342] P 5a801f07eaeb46368d7fc28f51b9e1d6: Request end of tablet copy session a419c27a5f5b4eaa819ede2e09199dc0-529e58fc77b545c388e07e325ebb525a received from {username='slave'} at 127.31.42.193:41551
I20250624 02:15:10.566942 3209 tablet_copy_service.cc:434] P 5a801f07eaeb46368d7fc28f51b9e1d6: ending tablet copy session a419c27a5f5b4eaa819ede2e09199dc0-529e58fc77b545c388e07e325ebb525a on tablet 529e58fc77b545c388e07e325ebb525a with peer a419c27a5f5b4eaa819ede2e09199dc0
I20250624 02:15:10.579921 3267 ts_tablet_manager.cc:927] T 45341658889647099a604223ab78b6db P 5a801f07eaeb46368d7fc28f51b9e1d6: Initiating tablet copy from peer a419c27a5f5b4eaa819ede2e09199dc0 (127.31.42.193:39677)
I20250624 02:15:10.581758 3267 tablet_copy_client.cc:323] T 45341658889647099a604223ab78b6db P 5a801f07eaeb46368d7fc28f51b9e1d6: tablet copy: Beginning tablet copy session from remote peer at address 127.31.42.193:39677
I20250624 02:15:10.583186 2904 tablet_copy_service.cc:140] P a419c27a5f5b4eaa819ede2e09199dc0: Received BeginTabletCopySession request for tablet 45341658889647099a604223ab78b6db from peer 5a801f07eaeb46368d7fc28f51b9e1d6 ({username='slave'} at 127.31.42.195:40033)
I20250624 02:15:10.583572 2904 tablet_copy_service.cc:161] P a419c27a5f5b4eaa819ede2e09199dc0: Beginning new tablet copy session on tablet 45341658889647099a604223ab78b6db from peer 5a801f07eaeb46368d7fc28f51b9e1d6 at {username='slave'} at 127.31.42.195:40033: session id = 5a801f07eaeb46368d7fc28f51b9e1d6-45341658889647099a604223ab78b6db
I20250624 02:15:10.587793 2904 tablet_copy_source_session.cc:215] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0: Tablet Copy: opened 0 blocks and 1 log segments
I20250624 02:15:10.590440 3267 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 45341658889647099a604223ab78b6db. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:15:10.602726 3267 tablet_copy_client.cc:806] T 45341658889647099a604223ab78b6db P 5a801f07eaeb46368d7fc28f51b9e1d6: tablet copy: Starting download of 0 data blocks...
I20250624 02:15:10.603302 3267 tablet_copy_client.cc:670] T 45341658889647099a604223ab78b6db P 5a801f07eaeb46368d7fc28f51b9e1d6: tablet copy: Starting download of 1 WAL segments...
I20250624 02:15:10.606678 3267 tablet_copy_client.cc:538] T 45341658889647099a604223ab78b6db P 5a801f07eaeb46368d7fc28f51b9e1d6: tablet copy: Tablet Copy complete. Replacing tablet superblock.
I20250624 02:15:10.612138 3267 tablet_bootstrap.cc:492] T 45341658889647099a604223ab78b6db P 5a801f07eaeb46368d7fc28f51b9e1d6: Bootstrap starting.
I20250624 02:15:10.612138 2728 catalog_manager.cc:5129] ChangeConfig:ADD_PEER:NON_VOTER RPC for tablet 45341658889647099a604223ab78b6db with cas_config_opid_index 8: aborting the task: latest config opid_index 10; task opid_index 8
I20250624 02:15:10.684121 3267 tablet_bootstrap.cc:492] T 45341658889647099a604223ab78b6db P 5a801f07eaeb46368d7fc28f51b9e1d6: Bootstrap replayed 1/1 log segments. Stats: ops{read=10 overwritten=0 applied=10 ignored=0} inserts{seen=250 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 02:15:10.684702 3267 tablet_bootstrap.cc:492] T 45341658889647099a604223ab78b6db P 5a801f07eaeb46368d7fc28f51b9e1d6: Bootstrap complete.
I20250624 02:15:10.685106 3267 ts_tablet_manager.cc:1397] T 45341658889647099a604223ab78b6db P 5a801f07eaeb46368d7fc28f51b9e1d6: Time spent bootstrapping tablet: real 0.073s user 0.067s sys 0.004s
I20250624 02:15:10.686776 3267 raft_consensus.cc:357] T 45341658889647099a604223ab78b6db P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 2 LEARNER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: 10 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: NON_VOTER last_known_addr { host: "127.31.42.195" port: 42985 } attrs { promote: true } }
I20250624 02:15:10.687242 3267 raft_consensus.cc:738] T 45341658889647099a604223ab78b6db P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 2 LEARNER]: Becoming Follower/Learner. State: Replica: 5a801f07eaeb46368d7fc28f51b9e1d6, State: Initialized, Role: LEARNER
I20250624 02:15:10.687623 3267 consensus_queue.cc:260] T 45341658889647099a604223ab78b6db P 5a801f07eaeb46368d7fc28f51b9e1d6 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 10, Last appended: 2.10, Last appended by leader: 10, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: 10 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: NON_VOTER last_known_addr { host: "127.31.42.195" port: 42985 } attrs { promote: true } }
I20250624 02:15:10.688917 3267 ts_tablet_manager.cc:1428] T 45341658889647099a604223ab78b6db P 5a801f07eaeb46368d7fc28f51b9e1d6: Time spent starting tablet: real 0.004s user 0.004s sys 0.000s
I20250624 02:15:10.690411 2904 tablet_copy_service.cc:342] P a419c27a5f5b4eaa819ede2e09199dc0: Request end of tablet copy session 5a801f07eaeb46368d7fc28f51b9e1d6-45341658889647099a604223ab78b6db received from {username='slave'} at 127.31.42.195:40033
I20250624 02:15:10.690832 2904 tablet_copy_service.cc:434] P a419c27a5f5b4eaa819ede2e09199dc0: ending tablet copy session 5a801f07eaeb46368d7fc28f51b9e1d6-45341658889647099a604223ab78b6db on tablet 45341658889647099a604223ab78b6db with peer 5a801f07eaeb46368d7fc28f51b9e1d6
I20250624 02:15:10.852919 31915 ts_itest-base.cc:246] Waiting for 1 tablets on tserver a419c27a5f5b4eaa819ede2e09199dc0 to finish bootstrapping
I20250624 02:15:10.857075 3028 raft_consensus.cc:1215] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 LEARNER]: Deduplicated request from leader. Original: 2.9->[2.10-2.10] Dedup: 2.10->[]
I20250624 02:15:10.874789 31915 ts_itest-base.cc:246] Waiting for 1 tablets on tserver 6431b2b3e46e4e199cfb1609d7c42607 to finish bootstrapping
I20250624 02:15:10.891749 2884 raft_consensus.cc:1215] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0 [term 2 LEARNER]: Deduplicated request from leader. Original: 2.9->[2.10-2.10] Dedup: 2.10->[]
I20250624 02:15:10.895992 31915 ts_itest-base.cc:246] Waiting for 1 tablets on tserver 5a801f07eaeb46368d7fc28f51b9e1d6 to finish bootstrapping
I20250624 02:15:11.120108 3189 raft_consensus.cc:1215] T 45341658889647099a604223ab78b6db P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 2 LEARNER]: Deduplicated request from leader. Original: 2.9->[2.10-2.10] Dedup: 2.10->[]
I20250624 02:15:11.202962 3169 tablet_service.cc:1430] Tablet server has 1 leaders and 0 scanners
I20250624 02:15:11.205379 3008 tablet_service.cc:1430] Tablet server has 1 leaders and 0 scanners
I20250624 02:15:11.211731 2864 tablet_service.cc:1430] Tablet server has 1 leaders and 0 scanners
I20250624 02:15:11.349169 3232 raft_consensus.cc:1062] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6: attempting to promote NON_VOTER 6431b2b3e46e4e199cfb1609d7c42607 to VOTER
I20250624 02:15:11.351011 3232 consensus_queue.cc:237] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 10, Committed index: 10, Last appended: 2.10, Last appended by leader: 7, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 11 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: NON_VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: true } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } }
I20250624 02:15:11.355844 3028 raft_consensus.cc:1273] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 LEARNER]: Refusing update from remote peer 5a801f07eaeb46368d7fc28f51b9e1d6: Log matching property violated. Preceding OpId in replica: term: 2 index: 10. Preceding OpId from leader: term: 2 index: 11. (index mismatch)
I20250624 02:15:11.356341 2884 raft_consensus.cc:1273] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0 [term 2 LEARNER]: Refusing update from remote peer 5a801f07eaeb46368d7fc28f51b9e1d6: Log matching property violated. Preceding OpId in replica: term: 2 index: 10. Preceding OpId from leader: term: 2 index: 11. (index mismatch)
I20250624 02:15:11.357205 3232 consensus_queue.cc:1035] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [LEADER]: Connected to new peer: Peer: permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 11, Last known committed idx: 10, Time since last communication: 0.000s
I20250624 02:15:11.358114 3270 consensus_queue.cc:1035] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [LEADER]: Connected to new peer: Peer: permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: NON_VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: true }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 11, Last known committed idx: 10, Time since last communication: 0.001s
I20250624 02:15:11.366442 3232 raft_consensus.cc:1025] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 2 LEADER]: attempt to promote peer a419c27a5f5b4eaa819ede2e09199dc0: there is already a config change operation in progress. Unable to promote follower until it completes. Doing nothing.
I20250624 02:15:11.368137 3271 raft_consensus.cc:2953] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 2 LEADER]: Committing config change with OpId 2.11: config changed from index 10 to 11, 6431b2b3e46e4e199cfb1609d7c42607 (127.31.42.194) changed from NON_VOTER to VOTER. New config: { opid_index: 11 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: NON_VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: true } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } }
I20250624 02:15:11.369861 3028 raft_consensus.cc:2953] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 FOLLOWER]: Committing config change with OpId 2.11: config changed from index 10 to 11, 6431b2b3e46e4e199cfb1609d7c42607 (127.31.42.194) changed from NON_VOTER to VOTER. New config: { opid_index: 11 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: NON_VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: true } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } }
I20250624 02:15:11.382905 2884 raft_consensus.cc:2953] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0 [term 2 LEARNER]: Committing config change with OpId 2.11: config changed from index 10 to 11, 6431b2b3e46e4e199cfb1609d7c42607 (127.31.42.194) changed from NON_VOTER to VOTER. New config: { opid_index: 11 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: NON_VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: true } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } }
I20250624 02:15:11.397512 2741 catalog_manager.cc:5582] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607 reported cstate change: config changed from index 10 to 11, 6431b2b3e46e4e199cfb1609d7c42607 (127.31.42.194) changed from NON_VOTER to VOTER. New cstate: current_term: 2 leader_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" committed_config { opid_index: 11 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: NON_VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: true } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } }
I20250624 02:15:11.403115 3232 raft_consensus.cc:1062] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6: attempting to promote NON_VOTER a419c27a5f5b4eaa819ede2e09199dc0 to VOTER
I20250624 02:15:11.405164 3232 consensus_queue.cc:237] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 11, Committed index: 11, Last appended: 2.11, Last appended by leader: 7, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 12 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: false } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } }
I20250624 02:15:11.413426 3028 raft_consensus.cc:1273] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 FOLLOWER]: Refusing update from remote peer 5a801f07eaeb46368d7fc28f51b9e1d6: Log matching property violated. Preceding OpId in replica: term: 2 index: 11. Preceding OpId from leader: term: 2 index: 12. (index mismatch)
I20250624 02:15:11.413568 2883 raft_consensus.cc:1273] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0 [term 2 LEARNER]: Refusing update from remote peer 5a801f07eaeb46368d7fc28f51b9e1d6: Log matching property violated. Preceding OpId in replica: term: 2 index: 11. Preceding OpId from leader: term: 2 index: 12. (index mismatch)
I20250624 02:15:11.414927 3232 consensus_queue.cc:1035] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [LEADER]: Connected to new peer: Peer: permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 12, Last known committed idx: 11, Time since last communication: 0.000s
I20250624 02:15:11.416142 3270 consensus_queue.cc:1035] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [LEADER]: Connected to new peer: Peer: permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: false }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 12, Last known committed idx: 11, Time since last communication: 0.000s
I20250624 02:15:11.427947 2882 raft_consensus.cc:2953] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0 [term 2 FOLLOWER]: Committing config change with OpId 2.12: config changed from index 11 to 12, a419c27a5f5b4eaa819ede2e09199dc0 (127.31.42.193) changed from NON_VOTER to VOTER. New config: { opid_index: 12 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: false } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } }
I20250624 02:15:11.425596 3270 raft_consensus.cc:2953] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 2 LEADER]: Committing config change with OpId 2.12: config changed from index 11 to 12, a419c27a5f5b4eaa819ede2e09199dc0 (127.31.42.193) changed from NON_VOTER to VOTER. New config: { opid_index: 12 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: false } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } }
I20250624 02:15:11.449842 3028 raft_consensus.cc:2953] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 FOLLOWER]: Committing config change with OpId 2.12: config changed from index 11 to 12, a419c27a5f5b4eaa819ede2e09199dc0 (127.31.42.193) changed from NON_VOTER to VOTER. New config: { opid_index: 12 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: false } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } }
I20250624 02:15:11.456424 2742 catalog_manager.cc:5582] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 reported cstate change: config changed from index 11 to 12, a419c27a5f5b4eaa819ede2e09199dc0 (127.31.42.193) changed from NON_VOTER to VOTER. New cstate: current_term: 2 leader_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" committed_config { opid_index: 12 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: false } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } health_report { overall_health: UNKNOWN } } }
Master Summary
UUID | Address | Status
----------------------------------+---------------------+---------
647b3bf07b6d4b56a97941e98dcfc165 | 127.31.42.254:39985 | HEALTHY
Unusual flags for Master:
Flag | Value | Tags | Master
----------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------+-----------------+-------------------------
ipki_ca_key_size | 768 | experimental | all 1 server(s) checked
ipki_server_key_size | 768 | experimental | all 1 server(s) checked
never_fsync | true | unsafe,advanced | all 1 server(s) checked
openssl_security_level_override | 0 | unsafe,hidden | all 1 server(s) checked
rpc_reuseport | true | experimental | all 1 server(s) checked
rpc_server_allow_ephemeral_ports | true | unsafe | all 1 server(s) checked
server_dump_info_format | pb | hidden | all 1 server(s) checked
server_dump_info_path | /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb | hidden | all 1 server(s) checked
tsk_num_rsa_bits | 512 | experimental | all 1 server(s) checked
Flags of checked categories for Master:
Flag | Value | Master
---------------------+---------------------+-------------------------
builtin_ntp_servers | 127.31.42.212:37039 | all 1 server(s) checked
time_source | builtin | all 1 server(s) checked
Tablet Server Summary
UUID | Address | Status | Location | Tablet Leaders | Active Scanners
----------------------------------+---------------------+---------+----------+----------------+-----------------
5a801f07eaeb46368d7fc28f51b9e1d6 | 127.31.42.195:42985 | HEALTHY | <none> | 1 | 0
6431b2b3e46e4e199cfb1609d7c42607 | 127.31.42.194:35121 | HEALTHY | <none> | 1 | 0
a419c27a5f5b4eaa819ede2e09199dc0 | 127.31.42.193:39677 | HEALTHY | <none> | 1 | 0
Tablet Server Location Summary
Location | Count
----------+---------
<none> | 3
Unusual flags for Tablet Server:
Flag | Value | Tags | Tablet Server
----------------------------------+-------------------------------------------------------------------------------------------------------------------------------------------------------------+-----------------+-------------------------
ipki_server_key_size | 768 | experimental | all 3 server(s) checked
local_ip_for_outbound_sockets | 127.31.42.193 | experimental | 127.31.42.193:39677
local_ip_for_outbound_sockets | 127.31.42.194 | experimental | 127.31.42.194:35121
local_ip_for_outbound_sockets | 127.31.42.195 | experimental | 127.31.42.195:42985
never_fsync | true | unsafe,advanced | all 3 server(s) checked
openssl_security_level_override | 0 | unsafe,hidden | all 3 server(s) checked
rpc_server_allow_ephemeral_ports | true | unsafe | all 3 server(s) checked
server_dump_info_format | pb | hidden | all 3 server(s) checked
server_dump_info_path | /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/info.pb | hidden | 127.31.42.193:39677
server_dump_info_path | /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/info.pb | hidden | 127.31.42.194:35121
server_dump_info_path | /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/info.pb | hidden | 127.31.42.195:42985
Flags of checked categories for Tablet Server:
Flag | Value | Tablet Server
---------------------+---------------------+-------------------------
builtin_ntp_servers | 127.31.42.212:37039 | all 3 server(s) checked
time_source | builtin | all 3 server(s) checked
Version Summary
Version | Servers
-----------------+-------------------------
1.18.0-SNAPSHOT | all 4 server(s) checked
Tablet Summary
The cluster doesn't have any matching system tables
Summary by table
Name | RF | Status | Total Tablets | Healthy | Recovering | Under-replicated | Unavailable
------------+----+---------+---------------+---------+------------+------------------+-------------
TestTable | 3 | HEALTHY | 1 | 1 | 0 | 0 | 0
TestTable1 | 3 | HEALTHY | 1 | 1 | 0 | 0 | 0
TestTable2 | 1 | HEALTHY | 1 | 1 | 0 | 0 | 0
Tablet Replica Count Summary
Statistic | Replica Count
----------------+---------------
Minimum | 2
First Quartile | 2
Median | 2
Third Quartile | 3
Maximum | 3
Total Count Summary
| Total Count
----------------+-------------
Masters | 1
Tablet Servers | 3
Tables | 3
Tablets | 3
Replicas | 7
==================
Warnings:
==================
Some masters have unsafe, experimental, or hidden flags set
Some tablet servers have unsafe, experimental, or hidden flags set
OK
I20250624 02:15:11.475550 31915 log_verifier.cc:126] Checking tablet 45341658889647099a604223ab78b6db
I20250624 02:15:11.566084 31915 log_verifier.cc:177] Verified matching terms for 10 ops in tablet 45341658889647099a604223ab78b6db
I20250624 02:15:11.566368 31915 log_verifier.cc:126] Checking tablet 529e58fc77b545c388e07e325ebb525a
I20250624 02:15:11.603502 3257 raft_consensus.cc:1062] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0: attempting to promote NON_VOTER 5a801f07eaeb46368d7fc28f51b9e1d6 to VOTER
I20250624 02:15:11.604919 3257 consensus_queue.cc:237] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 10, Committed index: 10, Last appended: 2.10, Last appended by leader: 6, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 11 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } attrs { promote: false } }
I20250624 02:15:11.609934 3028 raft_consensus.cc:1273] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 FOLLOWER]: Refusing update from remote peer a419c27a5f5b4eaa819ede2e09199dc0: Log matching property violated. Preceding OpId in replica: term: 2 index: 10. Preceding OpId from leader: term: 2 index: 11. (index mismatch)
I20250624 02:15:11.610289 3189 raft_consensus.cc:1273] T 45341658889647099a604223ab78b6db P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 2 LEARNER]: Refusing update from remote peer a419c27a5f5b4eaa819ede2e09199dc0: Log matching property violated. Preceding OpId in replica: term: 2 index: 10. Preceding OpId from leader: term: 2 index: 11. (index mismatch)
I20250624 02:15:11.611290 3257 consensus_queue.cc:1035] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [LEADER]: Connected to new peer: Peer: permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 11, Last known committed idx: 10, Time since last communication: 0.000s
I20250624 02:15:11.612267 3288 consensus_queue.cc:1035] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [LEADER]: Connected to new peer: Peer: permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } attrs { promote: false }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 11, Last known committed idx: 10, Time since last communication: 0.000s
I20250624 02:15:11.627776 3288 raft_consensus.cc:2953] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 2 LEADER]: Committing config change with OpId 2.11: config changed from index 10 to 11, 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195) changed from NON_VOTER to VOTER. New config: { opid_index: 11 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } attrs { promote: false } } }
I20250624 02:15:11.629652 3189 raft_consensus.cc:2953] T 45341658889647099a604223ab78b6db P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 2 FOLLOWER]: Committing config change with OpId 2.11: config changed from index 10 to 11, 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195) changed from NON_VOTER to VOTER. New config: { opid_index: 11 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } attrs { promote: false } } }
I20250624 02:15:11.630932 3028 raft_consensus.cc:2953] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 FOLLOWER]: Committing config change with OpId 2.11: config changed from index 10 to 11, 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195) changed from NON_VOTER to VOTER. New config: { opid_index: 11 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } attrs { promote: false } } }
I20250624 02:15:11.643134 2741 catalog_manager.cc:5582] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 reported cstate change: config changed from index 10 to 11, 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195) changed from NON_VOTER to VOTER. New cstate: current_term: 2 leader_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" committed_config { opid_index: 11 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } attrs { promote: false } health_report { overall_health: HEALTHY } } }
I20250624 02:15:11.675370 31915 log_verifier.cc:177] Verified matching terms for 12 ops in tablet 529e58fc77b545c388e07e325ebb525a
I20250624 02:15:11.675647 31915 log_verifier.cc:126] Checking tablet dde7659b8edb4c1ca042d33a2906f7f1
I20250624 02:15:11.702070 31915 log_verifier.cc:177] Verified matching terms for 7 ops in tablet dde7659b8edb4c1ca042d33a2906f7f1
I20250624 02:15:11.702464 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 2708
I20250624 02:15:11.733358 31915 minidump.cc:252] Setting minidump size limit to 20M
I20250624 02:15:11.734580 31915 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:15:11.735563 31915 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:15:11.746501 3316 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:15:11.746665 3317 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:15:11.984369 3319 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:15:11.984817 31915 server_base.cc:1048] running on GCE node
I20250624 02:15:11.986047 31915 hybrid_clock.cc:584] initializing the hybrid clock with 'system_unsync' time source
W20250624 02:15:11.986269 31915 system_unsync_time.cc:38] NTP support is disabled. Clock error bounds will not be accurate. This configuration is not suitable for distributed clusters.
I20250624 02:15:11.986446 31915 hybrid_clock.cc:648] HybridClock initialized: now 1750731311986419 us; error 0 us; skew 500 ppm
I20250624 02:15:11.987129 31915 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:15:11.990583 31915 webserver.cc:469] Webserver started at http://0.0.0.0:46429/ using document root <none> and password file <none>
I20250624 02:15:11.991487 31915 fs_manager.cc:362] Metadata directory not provided
I20250624 02:15:11.991868 31915 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
W20250624 02:15:11.998409 3239 heartbeater.cc:646] Failed to heartbeat to 127.31.42.254:39985 (0 consecutive failures): Network error: Failed to send heartbeat to master: Client connection negotiation failed: client connection to 127.31.42.254:39985: connect: Connection refused (error 111)
I20250624 02:15:11.999468 31915 fs_manager.cc:714] Time spent opening directory manager: real 0.005s user 0.002s sys 0.003s
I20250624 02:15:12.003314 3327 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:15:12.004352 31915 fs_manager.cc:730] Time spent opening block manager: real 0.003s user 0.004s sys 0.000s
I20250624 02:15:12.004688 31915 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
uuid: "647b3bf07b6d4b56a97941e98dcfc165"
format_stamp: "Formatted at 2025-06-24 02:14:49 on dist-test-slave-5k9r"
I20250624 02:15:12.006418 31915 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:15:12.025599 31915 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:15:12.027096 31915 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:15:12.027549 31915 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:15:12.036332 31915 sys_catalog.cc:263] Verifying existing consensus state
W20250624 02:15:12.039867 31915 sys_catalog.cc:243] For a single master config, on-disk Raft master: 127.31.42.254:39985 exists but no master address supplied!
I20250624 02:15:12.041878 31915 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165: Bootstrap starting.
I20250624 02:15:12.084968 31915 log.cc:826] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165: Log is configured to *not* fsync() on all Append() calls
I20250624 02:15:12.151286 31915 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165: Bootstrap replayed 1/1 log segments. Stats: ops{read=30 overwritten=0 applied=30 ignored=0} inserts{seen=13 ignored=0} mutations{seen=21 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 02:15:12.152086 31915 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165: Bootstrap complete.
I20250624 02:15:12.165386 31915 raft_consensus.cc:357] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 3 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "647b3bf07b6d4b56a97941e98dcfc165" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 39985 } }
I20250624 02:15:12.166054 31915 raft_consensus.cc:738] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 3 FOLLOWER]: Becoming Follower/Learner. State: Replica: 647b3bf07b6d4b56a97941e98dcfc165, State: Initialized, Role: FOLLOWER
I20250624 02:15:12.166815 31915 consensus_queue.cc:260] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 30, Last appended: 3.30, Last appended by leader: 30, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "647b3bf07b6d4b56a97941e98dcfc165" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 39985 } }
I20250624 02:15:12.167304 31915 raft_consensus.cc:397] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 3 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 02:15:12.167565 31915 raft_consensus.cc:491] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 3 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 02:15:12.167887 31915 raft_consensus.cc:3058] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 3 FOLLOWER]: Advancing to term 4
I20250624 02:15:12.173933 31915 raft_consensus.cc:513] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 4 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "647b3bf07b6d4b56a97941e98dcfc165" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 39985 } }
I20250624 02:15:12.174767 31915 leader_election.cc:304] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [CANDIDATE]: Term 4 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 647b3bf07b6d4b56a97941e98dcfc165; no voters:
I20250624 02:15:12.175876 31915 leader_election.cc:290] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [CANDIDATE]: Term 4 election: Requested vote from peers
I20250624 02:15:12.176146 3334 raft_consensus.cc:2802] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 4 FOLLOWER]: Leader election won for term 4
I20250624 02:15:12.178025 3334 raft_consensus.cc:695] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 4 LEADER]: Becoming Leader. State: Replica: 647b3bf07b6d4b56a97941e98dcfc165, State: Running, Role: LEADER
I20250624 02:15:12.178823 3334 consensus_queue.cc:237] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 30, Committed index: 30, Last appended: 3.30, Last appended by leader: 30, Current term: 4, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "647b3bf07b6d4b56a97941e98dcfc165" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 39985 } }
I20250624 02:15:12.187160 3336 sys_catalog.cc:455] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 647b3bf07b6d4b56a97941e98dcfc165. Latest consensus state: current_term: 4 leader_uuid: "647b3bf07b6d4b56a97941e98dcfc165" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "647b3bf07b6d4b56a97941e98dcfc165" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 39985 } } }
I20250624 02:15:12.187742 3336 sys_catalog.cc:458] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [sys.catalog]: This master's current role is: LEADER
I20250624 02:15:12.187685 3335 sys_catalog.cc:455] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 4 leader_uuid: "647b3bf07b6d4b56a97941e98dcfc165" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "647b3bf07b6d4b56a97941e98dcfc165" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 39985 } } }
I20250624 02:15:12.188328 3335 sys_catalog.cc:458] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [sys.catalog]: This master's current role is: LEADER
I20250624 02:15:12.216357 31915 tablet_replica.cc:331] stopping tablet replica
I20250624 02:15:12.216941 31915 raft_consensus.cc:2241] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 4 LEADER]: Raft consensus shutting down.
I20250624 02:15:12.217382 31915 raft_consensus.cc:2270] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 4 FOLLOWER]: Raft consensus is shut down!
I20250624 02:15:12.219666 31915 master.cc:561] Master@0.0.0.0:7051 shutting down...
W20250624 02:15:12.220201 31915 acceptor_pool.cc:196] Could not shut down acceptor socket on 0.0.0.0:7051: Network error: shutdown error: Transport endpoint is not connected (error 107)
I20250624 02:15:12.250471 31915 master.cc:583] Master@0.0.0.0:7051 shutdown complete.
W20250624 02:15:12.660169 3078 heartbeater.cc:646] Failed to heartbeat to 127.31.42.254:39985 (0 consecutive failures): Network error: Failed to send heartbeat to master: Client connection negotiation failed: client connection to 127.31.42.254:39985: connect: Connection refused (error 111)
W20250624 02:15:12.666491 2929 heartbeater.cc:646] Failed to heartbeat to 127.31.42.254:39985 (0 consecutive failures): Network error: Failed to send heartbeat to master: Client connection negotiation failed: client connection to 127.31.42.254:39985: connect: Connection refused (error 111)
I20250624 02:15:17.668380 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 2779
I20250624 02:15:17.698539 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 2933
I20250624 02:15:17.727402 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 3082
I20250624 02:15:17.760224 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.31.42.254:39985
--webserver_interface=127.31.42.254
--webserver_port=36667
--builtin_ntp_servers=127.31.42.212:37039
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.31.42.254:39985 with env {}
W20250624 02:15:18.077620 3410 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:15:18.078277 3410 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:15:18.078761 3410 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:15:18.111572 3410 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20250624 02:15:18.111928 3410 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:15:18.112182 3410 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20250624 02:15:18.112413 3410 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20250624 02:15:18.148938 3410 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:37039
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.31.42.254:39985
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.31.42.254:39985
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb
--webserver_interface=127.31.42.254
--webserver_port=36667
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/logs
--logbuflevel=-1
--logtostderr=true
Master server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:15:18.150336 3410 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:15:18.152082 3410 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:15:18.167682 3417 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:15:18.167687 3416 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:15:18.167969 3419 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:15:19.380491 3418 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Resolving timed out after 1211 milliseconds
I20250624 02:15:19.380616 3410 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
I20250624 02:15:19.381909 3410 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:15:19.385118 3410 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:15:19.386581 3410 hybrid_clock.cc:648] HybridClock initialized: now 1750731319386531 us; error 67 us; skew 500 ppm
I20250624 02:15:19.387413 3410 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:15:19.395092 3410 webserver.cc:469] Webserver started at http://127.31.42.254:36667/ using document root <none> and password file <none>
I20250624 02:15:19.396090 3410 fs_manager.cc:362] Metadata directory not provided
I20250624 02:15:19.396294 3410 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:15:19.404641 3410 fs_manager.cc:714] Time spent opening directory manager: real 0.005s user 0.005s sys 0.000s
I20250624 02:15:19.409737 3426 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:15:19.411032 3410 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.003s sys 0.000s
I20250624 02:15:19.411350 3410 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
uuid: "647b3bf07b6d4b56a97941e98dcfc165"
format_stamp: "Formatted at 2025-06-24 02:14:49 on dist-test-slave-5k9r"
I20250624 02:15:19.413395 3410 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:15:19.486581 3410 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:15:19.488351 3410 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:15:19.489256 3410 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:15:19.565902 3410 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.254:39985
I20250624 02:15:19.566042 3478 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.254:39985 every 8 connection(s)
I20250624 02:15:19.568814 3410 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb
I20250624 02:15:19.576241 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 3410
I20250624 02:15:19.577768 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.193:39677
--local_ip_for_outbound_sockets=127.31.42.193
--tserver_master_addrs=127.31.42.254:39985
--webserver_port=38599
--webserver_interface=127.31.42.193
--builtin_ntp_servers=127.31.42.212:37039
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--consensus_rpc_timeout_ms=30000 with env {}
I20250624 02:15:19.580044 3479 sys_catalog.cc:263] Verifying existing consensus state
I20250624 02:15:19.585873 3479 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165: Bootstrap starting.
I20250624 02:15:19.597452 3479 log.cc:826] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165: Log is configured to *not* fsync() on all Append() calls
I20250624 02:15:19.678963 3479 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165: Bootstrap replayed 1/1 log segments. Stats: ops{read=34 overwritten=0 applied=34 ignored=0} inserts{seen=15 ignored=0} mutations{seen=23 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 02:15:19.679806 3479 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165: Bootstrap complete.
I20250624 02:15:19.699254 3479 raft_consensus.cc:357] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 5 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "647b3bf07b6d4b56a97941e98dcfc165" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 39985 } }
I20250624 02:15:19.701447 3479 raft_consensus.cc:738] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 5 FOLLOWER]: Becoming Follower/Learner. State: Replica: 647b3bf07b6d4b56a97941e98dcfc165, State: Initialized, Role: FOLLOWER
I20250624 02:15:19.702298 3479 consensus_queue.cc:260] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 34, Last appended: 5.34, Last appended by leader: 34, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "647b3bf07b6d4b56a97941e98dcfc165" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 39985 } }
I20250624 02:15:19.702852 3479 raft_consensus.cc:397] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 5 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 02:15:19.703143 3479 raft_consensus.cc:491] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 5 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 02:15:19.703449 3479 raft_consensus.cc:3058] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 5 FOLLOWER]: Advancing to term 6
I20250624 02:15:19.708859 3479 raft_consensus.cc:513] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 6 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "647b3bf07b6d4b56a97941e98dcfc165" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 39985 } }
I20250624 02:15:19.709558 3479 leader_election.cc:304] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [CANDIDATE]: Term 6 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 647b3bf07b6d4b56a97941e98dcfc165; no voters:
I20250624 02:15:19.711889 3479 leader_election.cc:290] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [CANDIDATE]: Term 6 election: Requested vote from peers
I20250624 02:15:19.712390 3483 raft_consensus.cc:2802] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 6 FOLLOWER]: Leader election won for term 6
I20250624 02:15:19.715822 3483 raft_consensus.cc:695] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [term 6 LEADER]: Becoming Leader. State: Replica: 647b3bf07b6d4b56a97941e98dcfc165, State: Running, Role: LEADER
I20250624 02:15:19.716820 3483 consensus_queue.cc:237] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 34, Committed index: 34, Last appended: 5.34, Last appended by leader: 34, Current term: 6, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "647b3bf07b6d4b56a97941e98dcfc165" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 39985 } }
I20250624 02:15:19.717306 3479 sys_catalog.cc:564] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [sys.catalog]: configured and running, proceeding with master startup.
I20250624 02:15:19.730139 3484 sys_catalog.cc:455] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 6 leader_uuid: "647b3bf07b6d4b56a97941e98dcfc165" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "647b3bf07b6d4b56a97941e98dcfc165" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 39985 } } }
I20250624 02:15:19.730957 3484 sys_catalog.cc:458] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [sys.catalog]: This master's current role is: LEADER
I20250624 02:15:19.730155 3485 sys_catalog.cc:455] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 647b3bf07b6d4b56a97941e98dcfc165. Latest consensus state: current_term: 6 leader_uuid: "647b3bf07b6d4b56a97941e98dcfc165" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "647b3bf07b6d4b56a97941e98dcfc165" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 39985 } } }
I20250624 02:15:19.733245 3485 sys_catalog.cc:458] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165 [sys.catalog]: This master's current role is: LEADER
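The raft_consensus.cc lines above show the single-replica fast path: the committed config holds exactly one VOTER (the local master), so the replica skips the election timeout, advances from term 5 to term 6, and wins with its own vote as the entire majority. A minimal sketch of that check, in Python rather than Kudu's C++ (the function name below is made up for illustration):

def should_trigger_immediate_election(local_uuid, peers):
    """peers: (uuid, member_type) pairs from the active Raft config."""
    voters = [uuid for uuid, member_type in peers if member_type == "VOTER"]
    return len(voters) == 1 and voters[0] == local_uuid

# Config from the log: a single VOTER, which is the local replica itself.
config = [("647b3bf07b6d4b56a97941e98dcfc165", "VOTER")]
assert should_trigger_immediate_election("647b3bf07b6d4b56a97941e98dcfc165", config)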
I20250624 02:15:19.738545 3490 catalog_manager.cc:1477] Loading table and tablet metadata into memory...
I20250624 02:15:19.758064 3490 catalog_manager.cc:671] Loaded metadata for table TestTable2 [id=84a311cc03c74208a3ed78497bc3a41d]
I20250624 02:15:19.760464 3490 catalog_manager.cc:671] Loaded metadata for table TestTable [id=d6070d47d8534960b0c3b407ada1dcc2]
I20250624 02:15:19.762874 3490 catalog_manager.cc:671] Loaded metadata for table TestTable1 [id=d6ec419af58945df89f9937c2a8ef823]
I20250624 02:15:19.775943 3490 tablet_loader.cc:96] loaded metadata for tablet 45341658889647099a604223ab78b6db (table TestTable [id=d6070d47d8534960b0c3b407ada1dcc2])
I20250624 02:15:19.777662 3490 tablet_loader.cc:96] loaded metadata for tablet 529e58fc77b545c388e07e325ebb525a (table TestTable1 [id=d6ec419af58945df89f9937c2a8ef823])
I20250624 02:15:19.779246 3490 tablet_loader.cc:96] loaded metadata for tablet dde7659b8edb4c1ca042d33a2906f7f1 (table TestTable2 [id=84a311cc03c74208a3ed78497bc3a41d])
I20250624 02:15:19.780786 3490 catalog_manager.cc:1486] Initializing Kudu cluster ID...
I20250624 02:15:19.787367 3490 catalog_manager.cc:1261] Loaded cluster ID: 5528683bbe9a41fc8b42fa74b6850b6d
I20250624 02:15:19.787783 3490 catalog_manager.cc:1497] Initializing Kudu internal certificate authority...
I20250624 02:15:19.800027 3490 catalog_manager.cc:1506] Loading token signing keys...
I20250624 02:15:19.806692 3490 catalog_manager.cc:5966] T 00000000000000000000000000000000 P 647b3bf07b6d4b56a97941e98dcfc165: Loaded TSK: 0
I20250624 02:15:19.809413 3490 catalog_manager.cc:1516] Initializing in-progress tserver states...
W20250624 02:15:19.937582 3481 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:15:19.938169 3481 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:15:19.938715 3481 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:15:19.970559 3481 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:15:19.971504 3481 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.193
I20250624 02:15:20.008216 3481 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:37039
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.193:39677
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/info.pb
--webserver_interface=127.31.42.193
--webserver_port=38599
--tserver_master_addrs=127.31.42.254:39985
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.193
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
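The stanza above, from tablet_server_runner.cc:78 through "TSAN enabled", is a plain dump of the server's non-default gflags followed by build metadata. When comparing daemons in a log like this one, the --key=value lines can be folded into a dictionary with a small helper; the snippet below is illustrative Python, not anything shipped with Kudu:

import re

FLAG_RE = re.compile(r"^--([A-Za-z0-9_]+)(?:=(.*))?$")

def parse_flag_dump(lines):
    """Collect --key=value lines into a dict; a bare --flag is treated as boolean true."""
    flags = {}
    for line in lines:
        m = FLAG_RE.match(line.strip())
        if m:
            flags[m.group(1)] = m.group(2) if m.group(2) is not None else "true"
    return flags

dump = [
    "--builtin_ntp_poll_interval_ms=100",
    "--time_source=builtin",
    "--rpc_bind_addresses=127.31.42.193:39677",
]
print(parse_flag_dump(dump)["rpc_bind_addresses"])  # 127.31.42.193:39677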
I20250624 02:15:20.010155 3481 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:15:20.011955 3481 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:15:20.030021 3508 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:15:21.434319 3506 debug-util.cc:398] Leaking SignalData structure 0x7b08000184e0 after lost signal to thread 3481
W20250624 02:15:20.030421 3507 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:15:21.877403 3506 kernel_stack_watchdog.cc:198] Thread 3481 stuck at /home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/util/thread.cc:641 for 405ms:
Kernel stack:
(could not read kernel stack)
User stack:
<Timed out: thread did not respond: maybe it is blocking signals>
W20250624 02:15:21.878543 3509 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Resolving timed out after 1850 milliseconds
W20250624 02:15:21.879061 3481 thread.cc:640] OpenStack (cloud detector) Time spent creating pthread: real 1.853s user 0.549s sys 1.177s
W20250624 02:15:21.879382 3481 thread.cc:606] OpenStack (cloud detector) Time spent starting thread: real 1.854s user 0.549s sys 1.177s
I20250624 02:15:21.880123 3481 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
W20250624 02:15:21.880198 3510 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:15:21.883109 3481 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:15:21.885771 3481 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:15:21.887302 3481 hybrid_clock.cc:648] HybridClock initialized: now 1750731321887225 us; error 80 us; skew 500 ppm
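The HybridClock line above reports an initial error of 80 us and a maximum assumed skew of 500 ppm. For any clock model with a bounded drift rate, the error bound grows at that rate until the next synchronization; the sketch below states that general relationship (an assumption about the model, not a claim about Kudu's exact bookkeeping):

def max_error_us(initial_error_us, skew_ppm, elapsed_us):
    """Upper bound on the clock error after elapsed_us, given a bounded drift rate."""
    return initial_error_us + skew_ppm * 1e-6 * elapsed_us

# With the values from the log, one second without a new sync adds 500 us to the bound:
print(max_error_us(80, 500, 1_000_000))  # 580.0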
I20250624 02:15:21.888171 3481 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:15:21.903098 3481 webserver.cc:469] Webserver started at http://127.31.42.193:38599/ using document root <none> and password file <none>
I20250624 02:15:21.904100 3481 fs_manager.cc:362] Metadata directory not provided
I20250624 02:15:21.904320 3481 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:15:21.912753 3481 fs_manager.cc:714] Time spent opening directory manager: real 0.005s user 0.006s sys 0.000s
I20250624 02:15:21.918279 3517 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:15:21.919647 3481 fs_manager.cc:730] Time spent opening block manager: real 0.005s user 0.004s sys 0.000s
I20250624 02:15:21.920004 3481 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
uuid: "a419c27a5f5b4eaa819ede2e09199dc0"
format_stamp: "Formatted at 2025-06-24 02:14:52 on dist-test-slave-5k9r"
I20250624 02:15:21.922039 3481 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:15:21.984054 3481 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:15:21.985538 3481 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:15:21.986008 3481 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:15:21.988857 3481 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:15:21.995647 3524 ts_tablet_manager.cc:542] Loading tablet metadata (0/2 complete)
I20250624 02:15:22.014386 3481 ts_tablet_manager.cc:579] Loaded tablet metadata (2 total tablets, 2 live tablets)
I20250624 02:15:22.014642 3481 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.021s user 0.002s sys 0.000s
I20250624 02:15:22.014951 3481 ts_tablet_manager.cc:594] Registering tablets (0/2 complete)
I20250624 02:15:22.020608 3524 tablet_bootstrap.cc:492] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0: Bootstrap starting.
I20250624 02:15:22.023301 3481 ts_tablet_manager.cc:610] Registered 2 tablets
I20250624 02:15:22.023530 3481 ts_tablet_manager.cc:589] Time spent register tablets: real 0.009s user 0.008s sys 0.000s
I20250624 02:15:22.084508 3524 log.cc:826] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0: Log is configured to *not* fsync() on all Append() calls
I20250624 02:15:22.221380 3481 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.193:39677
I20250624 02:15:22.221527 3631 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.193:39677 every 8 connection(s)
I20250624 02:15:22.225029 3481 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/info.pb
I20250624 02:15:22.226099 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 3481
I20250624 02:15:22.227906 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.194:35121
--local_ip_for_outbound_sockets=127.31.42.194
--tserver_master_addrs=127.31.42.254:39985
--webserver_port=39475
--webserver_interface=127.31.42.194
--builtin_ntp_servers=127.31.42.212:37039
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--consensus_rpc_timeout_ms=30000 with env {}
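external_mini_cluster.cc:1351 logs the exact argument vector used to start each daemon: the kudu binary, a set of global flags, the subcommand ("tserver run" here), and the subcommand's flags, with no extra environment ("with env {}"). Replaying such a command outside the test harness amounts to handing the same vector to the OS; the Python below is only a sketch of that shape (the paths and flags are copied from the log, and it assumes the binary exists on the machine running it):

import subprocess

argv = [
    "/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu",
    "--logtostderr",
    "--unlock_unsafe_flags",
    "tserver",
    "run",
    "--rpc_bind_addresses=127.31.42.194:35121",
    "--tserver_master_addrs=127.31.42.254:39985",
]
# "with env {}" in the log means no extra variables were added; inheriting the
# parent environment unchanged is the stand-alone equivalent.
proc = subprocess.Popen(argv)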
I20250624 02:15:22.229051 3524 tablet_bootstrap.cc:492] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0: Bootstrap replayed 1/1 log segments. Stats: ops{read=12 overwritten=0 applied=12 ignored=0} inserts{seen=300 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 02:15:22.230158 3524 tablet_bootstrap.cc:492] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0: Bootstrap complete.
I20250624 02:15:22.231909 3524 ts_tablet_manager.cc:1397] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0: Time spent bootstrapping tablet: real 0.212s user 0.158s sys 0.047s
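Each "Bootstrap replayed ..." line, like the one just above, summarizes WAL replay as nested name{key=value ...} groups plus an orphaned-commit count and the number of pending replicates. Pulling those numbers out of a log is a one-regex job; the helper below is illustrative Python, not part of Kudu:

import re

GROUP_RE = re.compile(r"(\w+)\{([^}]*)\}")

def parse_bootstrap_stats(stats):
    """Turn 'ops{read=12 ...} inserts{seen=300 ...}' into nested dicts of ints."""
    return {
        name: {k: int(v) for k, v in (item.split("=") for item in body.split())}
        for name, body in GROUP_RE.findall(stats)
    }

line = ("ops{read=12 overwritten=0 applied=12 ignored=0} "
        "inserts{seen=300 ignored=0} mutations{seen=0 ignored=0}")
print(parse_bootstrap_stats(line)["inserts"]["seen"])  # 300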
I20250624 02:15:22.257578 3524 raft_consensus.cc:357] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0 [term 2 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: 12 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: false } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } }
I20250624 02:15:22.261201 3524 raft_consensus.cc:738] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0 [term 2 FOLLOWER]: Becoming Follower/Learner. State: Replica: a419c27a5f5b4eaa819ede2e09199dc0, State: Initialized, Role: FOLLOWER
I20250624 02:15:22.262363 3524 consensus_queue.cc:260] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 12, Last appended: 2.12, Last appended by leader: 12, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: 12 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: false } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } }
I20250624 02:15:22.272814 3524 ts_tablet_manager.cc:1428] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0: Time spent starting tablet: real 0.041s user 0.041s sys 0.000s
I20250624 02:15:22.273671 3524 tablet_bootstrap.cc:492] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0: Bootstrap starting.
I20250624 02:15:22.275848 3632 heartbeater.cc:344] Connected to a master server at 127.31.42.254:39985
I20250624 02:15:22.276348 3632 heartbeater.cc:461] Registering TS with master...
I20250624 02:15:22.277611 3632 heartbeater.cc:507] Master 127.31.42.254:39985 requested a full tablet report, sending...
I20250624 02:15:22.283221 3444 ts_manager.cc:194] Registered new tserver with Master: a419c27a5f5b4eaa819ede2e09199dc0 (127.31.42.193:39677)
I20250624 02:15:22.288183 3444 catalog_manager.cc:5582] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 reported cstate change: config changed from index -1 to 11, term changed from 0 to 2, VOTER 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195) added, VOTER 6431b2b3e46e4e199cfb1609d7c42607 (127.31.42.194) added, VOTER a419c27a5f5b4eaa819ede2e09199dc0 (127.31.42.193) added. New cstate: current_term: 2 committed_config { opid_index: 11 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } attrs { promote: false } } }
I20250624 02:15:22.363765 3444 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.193:58799
I20250624 02:15:22.368430 3632 heartbeater.cc:499] Master 127.31.42.254:39985 was elected leader, sending a full tablet report...
I20250624 02:15:22.401568 3524 tablet_bootstrap.cc:492] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0: Bootstrap replayed 1/1 log segments. Stats: ops{read=11 overwritten=0 applied=11 ignored=0} inserts{seen=250 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 02:15:22.402316 3524 tablet_bootstrap.cc:492] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0: Bootstrap complete.
I20250624 02:15:22.403618 3524 ts_tablet_manager.cc:1397] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0: Time spent bootstrapping tablet: real 0.130s user 0.113s sys 0.012s
I20250624 02:15:22.405596 3524 raft_consensus.cc:357] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 2 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: 11 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } attrs { promote: false } }
I20250624 02:15:22.406113 3524 raft_consensus.cc:738] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 2 FOLLOWER]: Becoming Follower/Learner. State: Replica: a419c27a5f5b4eaa819ede2e09199dc0, State: Initialized, Role: FOLLOWER
I20250624 02:15:22.406700 3524 consensus_queue.cc:260] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 11, Last appended: 2.11, Last appended by leader: 11, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: 11 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } attrs { promote: false } }
I20250624 02:15:22.408284 3524 ts_tablet_manager.cc:1428] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0: Time spent starting tablet: real 0.004s user 0.004s sys 0.000s
W20250624 02:15:22.624117 3633 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:15:22.624639 3633 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:15:22.625154 3633 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:15:22.657423 3633 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:15:22.658360 3633 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.194
I20250624 02:15:22.693657 3633 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:37039
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.194:35121
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/info.pb
--webserver_interface=127.31.42.194
--webserver_port=39475
--tserver_master_addrs=127.31.42.254:39985
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.194
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:15:22.694993 3633 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:15:22.696609 3633 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:15:22.713815 3648 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:15:22.714202 3649 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:15:23.998015 3651 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:15:24.000308 3650 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Resolving timed out after 1282 milliseconds
W20250624 02:15:24.001101 3633 thread.cc:640] OpenStack (cloud detector) Time spent creating pthread: real 1.291s user 0.420s sys 0.860s
W20250624 02:15:24.001463 3633 thread.cc:606] OpenStack (cloud detector) Time spent starting thread: real 1.292s user 0.420s sys 0.860s
I20250624 02:15:24.001770 3633 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
I20250624 02:15:24.003330 3633 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:15:24.006325 3633 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:15:24.007836 3633 hybrid_clock.cc:648] HybridClock initialized: now 1750731324007773 us; error 59 us; skew 500 ppm
I20250624 02:15:24.009078 3633 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:15:24.018695 3633 webserver.cc:469] Webserver started at http://127.31.42.194:39475/ using document root <none> and password file <none>
I20250624 02:15:24.020237 3633 fs_manager.cc:362] Metadata directory not provided
I20250624 02:15:24.020576 3633 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:15:24.030141 3658 raft_consensus.cc:491] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0 [term 2 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250624 02:15:24.030697 3658 raft_consensus.cc:513] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0 [term 2 FOLLOWER]: Starting pre-election with config: opid_index: 12 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: false } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } }
I20250624 02:15:24.033097 3633 fs_manager.cc:714] Time spent opening directory manager: real 0.007s user 0.005s sys 0.001s
I20250624 02:15:24.033596 3658 leader_election.cc:290] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0 [CANDIDATE]: Term 3 pre-election: Requested pre-vote from peers 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195:42985), 6431b2b3e46e4e199cfb1609d7c42607 (127.31.42.194:35121)
W20250624 02:15:24.038570 3519 proxy.cc:239] Call had error, refreshing address and retrying: Network error: Client connection negotiation failed: client connection to 127.31.42.195:42985: connect: Connection refused (error 111)
W20250624 02:15:24.038560 3520 proxy.cc:239] Call had error, refreshing address and retrying: Network error: Client connection negotiation failed: client connection to 127.31.42.194:35121: connect: Connection refused (error 111)
I20250624 02:15:24.042287 3662 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:15:24.043845 3633 fs_manager.cc:730] Time spent opening block manager: real 0.007s user 0.000s sys 0.007s
W20250624 02:15:24.044611 3519 leader_election.cc:336] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0 [CANDIDATE]: Term 3 pre-election: RPC error from VoteRequest() call to peer 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195:42985): Network error: Client connection negotiation failed: client connection to 127.31.42.195:42985: connect: Connection refused (error 111)
W20250624 02:15:24.045183 3520 leader_election.cc:336] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0 [CANDIDATE]: Term 3 pre-election: RPC error from VoteRequest() call to peer 6431b2b3e46e4e199cfb1609d7c42607 (127.31.42.194:35121): Network error: Client connection negotiation failed: client connection to 127.31.42.194:35121: connect: Connection refused (error 111)
I20250624 02:15:24.045600 3520 leader_election.cc:304] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0 [CANDIDATE]: Term 3 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: a419c27a5f5b4eaa819ede2e09199dc0; no voters: 5a801f07eaeb46368d7fc28f51b9e1d6, 6431b2b3e46e4e199cfb1609d7c42607
I20250624 02:15:24.044230 3633 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
uuid: "6431b2b3e46e4e199cfb1609d7c42607"
format_stamp: "Formatted at 2025-06-24 02:14:53 on dist-test-slave-5k9r"
I20250624 02:15:24.046964 3658 raft_consensus.cc:2747] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0 [term 2 FOLLOWER]: Leader pre-election lost for term 3. Reason: could not achieve majority
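The pre-election above ends 1 yes / 2 no out of 3 voters: the candidate's own vote is the single yes, and both peers are still down, so their connection-refused RPCs land on the "no voters" side of the summary and the needed majority of 2 cannot be reached. A sketch of that tally, in illustrative Python rather than Kudu's leader_election.cc:

def decide_election(num_voters, yes_votes, no_votes):
    """Decide once a majority of yes or no responses (counting the local vote) is in."""
    majority = num_voters // 2 + 1
    if yes_votes >= majority:
        return "candidate won"
    if no_votes >= majority:
        return "candidate lost"
    return "undecided"

# Tally from the log: one self vote, two unreachable peers counted as no.
print(decide_election(3, yes_votes=1, no_votes=2))  # candidate lost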
I20250624 02:15:24.046845 3633 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:15:24.120774 3633 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:15:24.122310 3633 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:15:24.122761 3633 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:15:24.125313 3633 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:15:24.131093 3670 ts_tablet_manager.cc:542] Loading tablet metadata (0/3 complete)
I20250624 02:15:24.147512 3633 ts_tablet_manager.cc:579] Loaded tablet metadata (3 total tablets, 3 live tablets)
I20250624 02:15:24.147769 3633 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.018s user 0.002s sys 0.000s
I20250624 02:15:24.148133 3633 ts_tablet_manager.cc:594] Registering tablets (0/3 complete)
I20250624 02:15:24.153497 3670 tablet_bootstrap.cc:492] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607: Bootstrap starting.
I20250624 02:15:24.159464 3633 ts_tablet_manager.cc:610] Registered 3 tablets
I20250624 02:15:24.159685 3633 ts_tablet_manager.cc:589] Time spent register tablets: real 0.012s user 0.011s sys 0.000s
I20250624 02:15:24.207979 3670 log.cc:826] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607: Log is configured to *not* fsync() on all Append() calls
I20250624 02:15:24.272912 3658 raft_consensus.cc:491] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 2 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250624 02:15:24.273471 3658 raft_consensus.cc:513] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 2 FOLLOWER]: Starting pre-election with config: opid_index: 11 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } attrs { promote: false } }
I20250624 02:15:24.278357 3658 leader_election.cc:290] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [CANDIDATE]: Term 3 pre-election: Requested pre-vote from peers 6431b2b3e46e4e199cfb1609d7c42607 (127.31.42.194:35121), 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195:42985)
W20250624 02:15:24.281283 3520 leader_election.cc:336] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [CANDIDATE]: Term 3 pre-election: RPC error from VoteRequest() call to peer 6431b2b3e46e4e199cfb1609d7c42607 (127.31.42.194:35121): Network error: Client connection negotiation failed: client connection to 127.31.42.194:35121: connect: Connection refused (error 111)
W20250624 02:15:24.281967 3519 leader_election.cc:336] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [CANDIDATE]: Term 3 pre-election: RPC error from VoteRequest() call to peer 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195:42985): Network error: Client connection negotiation failed: client connection to 127.31.42.195:42985: connect: Connection refused (error 111)
I20250624 02:15:24.282405 3519 leader_election.cc:304] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [CANDIDATE]: Term 3 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: a419c27a5f5b4eaa819ede2e09199dc0; no voters: 5a801f07eaeb46368d7fc28f51b9e1d6, 6431b2b3e46e4e199cfb1609d7c42607
I20250624 02:15:24.283607 3658 raft_consensus.cc:2747] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 2 FOLLOWER]: Leader pre-election lost for term 3. Reason: could not achieve majority
I20250624 02:15:24.335390 3633 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.194:35121
I20250624 02:15:24.335515 3777 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.194:35121 every 8 connection(s)
I20250624 02:15:24.338879 3633 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/info.pb
I20250624 02:15:24.348726 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 3633
I20250624 02:15:24.351049 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.195:42985
--local_ip_for_outbound_sockets=127.31.42.195
--tserver_master_addrs=127.31.42.254:39985
--webserver_port=39881
--webserver_interface=127.31.42.195
--builtin_ntp_servers=127.31.42.212:37039
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--consensus_rpc_timeout_ms=30000 with env {}
I20250624 02:15:24.356264 3670 tablet_bootstrap.cc:492] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607: Bootstrap replayed 1/1 log segments. Stats: ops{read=12 overwritten=0 applied=12 ignored=0} inserts{seen=300 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 02:15:24.357148 3670 tablet_bootstrap.cc:492] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607: Bootstrap complete.
I20250624 02:15:24.358641 3670 ts_tablet_manager.cc:1397] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607: Time spent bootstrapping tablet: real 0.206s user 0.159s sys 0.044s
I20250624 02:15:24.366511 3778 heartbeater.cc:344] Connected to a master server at 127.31.42.254:39985
I20250624 02:15:24.366930 3778 heartbeater.cc:461] Registering TS with master...
I20250624 02:15:24.367883 3778 heartbeater.cc:507] Master 127.31.42.254:39985 requested a full tablet report, sending...
I20250624 02:15:24.371340 3443 ts_manager.cc:194] Registered new tserver with Master: 6431b2b3e46e4e199cfb1609d7c42607 (127.31.42.194:35121)
I20250624 02:15:24.375272 3443 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.194:43217
I20250624 02:15:24.376636 3670 raft_consensus.cc:357] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: 12 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: false } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } }
I20250624 02:15:24.380643 3670 raft_consensus.cc:738] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 FOLLOWER]: Becoming Follower/Learner. State: Replica: 6431b2b3e46e4e199cfb1609d7c42607, State: Initialized, Role: FOLLOWER
I20250624 02:15:24.381650 3670 consensus_queue.cc:260] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 12, Last appended: 2.12, Last appended by leader: 12, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: 12 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: false } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } }
I20250624 02:15:24.384671 3778 heartbeater.cc:499] Master 127.31.42.254:39985 was elected leader, sending a full tablet report...
I20250624 02:15:24.385707 3670 ts_tablet_manager.cc:1428] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607: Time spent starting tablet: real 0.027s user 0.026s sys 0.000s
I20250624 02:15:24.386468 3670 tablet_bootstrap.cc:492] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607: Bootstrap starting.
I20250624 02:15:24.510363 3670 tablet_bootstrap.cc:492] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607: Bootstrap replayed 1/1 log segments. Stats: ops{read=11 overwritten=0 applied=11 ignored=0} inserts{seen=250 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 02:15:24.511114 3670 tablet_bootstrap.cc:492] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607: Bootstrap complete.
I20250624 02:15:24.513032 3670 ts_tablet_manager.cc:1397] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607: Time spent bootstrapping tablet: real 0.127s user 0.108s sys 0.015s
I20250624 02:15:24.514688 3670 raft_consensus.cc:357] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: 11 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } attrs { promote: false } }
I20250624 02:15:24.515147 3670 raft_consensus.cc:738] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 FOLLOWER]: Becoming Follower/Learner. State: Replica: 6431b2b3e46e4e199cfb1609d7c42607, State: Initialized, Role: FOLLOWER
I20250624 02:15:24.515673 3670 consensus_queue.cc:260] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 11, Last appended: 2.11, Last appended by leader: 11, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: 11 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } attrs { promote: false } }
I20250624 02:15:24.517189 3670 ts_tablet_manager.cc:1428] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607: Time spent starting tablet: real 0.004s user 0.004s sys 0.000s
I20250624 02:15:24.517810 3670 tablet_bootstrap.cc:492] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607: Bootstrap starting.
I20250624 02:15:24.600193 3670 tablet_bootstrap.cc:492] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607: Bootstrap replayed 1/1 log segments. Stats: ops{read=7 overwritten=0 applied=7 ignored=0} inserts{seen=250 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 02:15:24.600863 3670 tablet_bootstrap.cc:492] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607: Bootstrap complete.
I20250624 02:15:24.602118 3670 ts_tablet_manager.cc:1397] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607: Time spent bootstrapping tablet: real 0.084s user 0.073s sys 0.011s
I20250624 02:15:24.603686 3670 raft_consensus.cc:357] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } }
I20250624 02:15:24.604068 3670 raft_consensus.cc:738] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 FOLLOWER]: Becoming Follower/Learner. State: Replica: 6431b2b3e46e4e199cfb1609d7c42607, State: Initialized, Role: FOLLOWER
I20250624 02:15:24.604552 3670 consensus_queue.cc:260] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 7, Last appended: 2.7, Last appended by leader: 7, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } }
I20250624 02:15:24.604962 3670 raft_consensus.cc:397] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 02:15:24.605222 3670 raft_consensus.cc:491] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 02:15:24.605523 3670 raft_consensus.cc:3058] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 FOLLOWER]: Advancing to term 3
I20250624 02:15:24.610603 3670 raft_consensus.cc:513] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [term 3 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } }
I20250624 02:15:24.611264 3670 leader_election.cc:304] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [CANDIDATE]: Term 3 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 6431b2b3e46e4e199cfb1609d7c42607; no voters:
I20250624 02:15:24.611829 3670 leader_election.cc:290] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [CANDIDATE]: Term 3 election: Requested vote from peers
I20250624 02:15:24.612120 3783 raft_consensus.cc:2802] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [term 3 FOLLOWER]: Leader election won for term 3
I20250624 02:15:24.615216 3670 ts_tablet_manager.cc:1428] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607: Time spent starting tablet: real 0.013s user 0.012s sys 0.000s
I20250624 02:15:24.615816 3783 raft_consensus.cc:695] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [term 3 LEADER]: Becoming Leader. State: Replica: 6431b2b3e46e4e199cfb1609d7c42607, State: Running, Role: LEADER
I20250624 02:15:24.616698 3783 consensus_queue.cc:237] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 7, Committed index: 7, Last appended: 2.7, Last appended by leader: 7, Current term: 3, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } }
I20250624 02:15:24.627274 3443 catalog_manager.cc:5582] T dde7659b8edb4c1ca042d33a2906f7f1 P 6431b2b3e46e4e199cfb1609d7c42607 reported cstate change: term changed from 2 to 3. New cstate: current_term: 3 leader_uuid: "6431b2b3e46e4e199cfb1609d7c42607" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } health_report { overall_health: HEALTHY } } }
W20250624 02:15:24.727885 3782 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:15:24.728394 3782 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:15:24.728899 3782 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:15:24.765966 3782 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:15:24.766853 3782 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.195
I20250624 02:15:24.802574 3782 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:37039
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.195:42985
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/info.pb
--webserver_interface=127.31.42.195
--webserver_port=39881
--tserver_master_addrs=127.31.42.254:39985
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.195
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:15:24.803911 3782 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:15:24.805867 3782 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:15:24.823211 3799 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:15:25.612465 3805 raft_consensus.cc:491] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250624 02:15:25.612871 3805 raft_consensus.cc:513] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 FOLLOWER]: Starting pre-election with config: opid_index: 12 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: false } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } }
I20250624 02:15:25.614943 3805 leader_election.cc:290] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607 [CANDIDATE]: Term 3 pre-election: Requested pre-vote from peers 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195:42985), a419c27a5f5b4eaa819ede2e09199dc0 (127.31.42.193:39677)
W20250624 02:15:25.619465 3665 proxy.cc:239] Call had error, refreshing address and retrying: Network error: Client connection negotiation failed: client connection to 127.31.42.195:42985: connect: Connection refused (error 111)
W20250624 02:15:25.626781 3665 leader_election.cc:336] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607 [CANDIDATE]: Term 3 pre-election: RPC error from VoteRequest() call to peer 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195:42985): Network error: Client connection negotiation failed: client connection to 127.31.42.195:42985: connect: Connection refused (error 111)
I20250624 02:15:25.635192 3587 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "529e58fc77b545c388e07e325ebb525a" candidate_uuid: "6431b2b3e46e4e199cfb1609d7c42607" candidate_term: 3 candidate_status { last_received { term: 2 index: 12 } } ignore_live_leader: false dest_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" is_pre_election: true
I20250624 02:15:25.635819 3587 raft_consensus.cc:2466] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0 [term 2 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 6431b2b3e46e4e199cfb1609d7c42607 in term 2.
I20250624 02:15:25.637200 3665 leader_election.cc:304] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607 [CANDIDATE]: Term 3 pre-election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 6431b2b3e46e4e199cfb1609d7c42607, a419c27a5f5b4eaa819ede2e09199dc0; no voters: 5a801f07eaeb46368d7fc28f51b9e1d6
I20250624 02:15:25.637832 3805 raft_consensus.cc:2802] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 FOLLOWER]: Leader pre-election won for term 3
I20250624 02:15:25.638186 3805 raft_consensus.cc:491] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20250624 02:15:25.638471 3805 raft_consensus.cc:3058] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 FOLLOWER]: Advancing to term 3
I20250624 02:15:25.642762 3805 raft_consensus.cc:513] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607 [term 3 FOLLOWER]: Starting leader election with config: opid_index: 12 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: false } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } }
I20250624 02:15:25.644250 3805 leader_election.cc:290] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607 [CANDIDATE]: Term 3 election: Requested vote from peers 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195:42985), a419c27a5f5b4eaa819ede2e09199dc0 (127.31.42.193:39677)
I20250624 02:15:25.646534 3587 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "529e58fc77b545c388e07e325ebb525a" candidate_uuid: "6431b2b3e46e4e199cfb1609d7c42607" candidate_term: 3 candidate_status { last_received { term: 2 index: 12 } } ignore_live_leader: false dest_uuid: "a419c27a5f5b4eaa819ede2e09199dc0"
I20250624 02:15:25.647140 3587 raft_consensus.cc:3058] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0 [term 2 FOLLOWER]: Advancing to term 3
W20250624 02:15:25.648623 3665 leader_election.cc:336] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607 [CANDIDATE]: Term 3 election: RPC error from VoteRequest() call to peer 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195:42985): Network error: Client connection negotiation failed: client connection to 127.31.42.195:42985: connect: Connection refused (error 111)
I20250624 02:15:25.654345 3587 raft_consensus.cc:2466] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0 [term 3 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 6431b2b3e46e4e199cfb1609d7c42607 in term 3.
I20250624 02:15:25.655303 3665 leader_election.cc:304] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607 [CANDIDATE]: Term 3 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 6431b2b3e46e4e199cfb1609d7c42607, a419c27a5f5b4eaa819ede2e09199dc0; no voters: 5a801f07eaeb46368d7fc28f51b9e1d6
I20250624 02:15:25.655893 3805 raft_consensus.cc:2802] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607 [term 3 FOLLOWER]: Leader election won for term 3
I20250624 02:15:25.656872 3805 raft_consensus.cc:695] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607 [term 3 LEADER]: Becoming Leader. State: Replica: 6431b2b3e46e4e199cfb1609d7c42607, State: Running, Role: LEADER
I20250624 02:15:25.657723 3805 consensus_queue.cc:237] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 12, Committed index: 12, Last appended: 2.12, Last appended by leader: 12, Current term: 3, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 12 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: false } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } }
I20250624 02:15:25.667650 3443 catalog_manager.cc:5582] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607 reported cstate change: term changed from 2 to 3, leader changed from 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195) to 6431b2b3e46e4e199cfb1609d7c42607 (127.31.42.194). New cstate: current_term: 3 leader_uuid: "6431b2b3e46e4e199cfb1609d7c42607" committed_config { opid_index: 12 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: false } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } health_report { overall_health: HEALTHY } } }
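The election summaries above (pre-election and then the term-3 election for tablet 529e58fc...) both report "candidate won" with 2 yes votes out of 3 voters while one peer stays unreachable. A minimal, self-contained sketch of that strict-majority decision rule follows; the VoteCounter type and its fields are hypothetical illustrations for this log, not Kudu's leader_election.cc API.

```cpp
#include <iostream>

// Illustrative sketch only (hypothetical type, not Kudu's implementation):
// a candidate wins once yes votes reach a strict majority of the voter set,
// even if some peers never answer (the "Connection refused" peer above ends
// up in the "no voters" bucket of the election summary).
struct VoteCounter {
  int num_voters;   // size of the Raft config (3 in the log)
  int yes_votes;
  int no_votes;

  int majority() const { return num_voters / 2 + 1; }
  bool won() const { return yes_votes >= majority(); }
  bool lost() const { return no_votes >= majority(); }
};

int main() {
  // Mirrors the summary: "received 3 responses out of 3 voters: 2 yes votes; 1 no votes".
  VoteCounter c{3, 2, 1};
  std::cout << (c.won() ? "candidate won" : c.lost() ? "candidate lost" : "undecided")
            << " (majority size " << c.majority() << ")\n";
}
```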
I20250624 02:15:25.766943 3805 raft_consensus.cc:491] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250624 02:15:25.767400 3805 raft_consensus.cc:513] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 FOLLOWER]: Starting pre-election with config: opid_index: 11 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } attrs { promote: false } }
I20250624 02:15:25.768884 3805 leader_election.cc:290] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [CANDIDATE]: Term 3 pre-election: Requested pre-vote from peers a419c27a5f5b4eaa819ede2e09199dc0 (127.31.42.193:39677), 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195:42985)
I20250624 02:15:25.770105 3587 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "45341658889647099a604223ab78b6db" candidate_uuid: "6431b2b3e46e4e199cfb1609d7c42607" candidate_term: 3 candidate_status { last_received { term: 2 index: 11 } } ignore_live_leader: false dest_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" is_pre_election: true
I20250624 02:15:25.771028 3587 raft_consensus.cc:2466] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 2 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 6431b2b3e46e4e199cfb1609d7c42607 in term 2.
I20250624 02:15:25.772327 3665 leader_election.cc:304] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [CANDIDATE]: Term 3 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 6431b2b3e46e4e199cfb1609d7c42607, a419c27a5f5b4eaa819ede2e09199dc0; no voters:
I20250624 02:15:25.773425 3805 raft_consensus.cc:2802] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 FOLLOWER]: Leader pre-election won for term 3
I20250624 02:15:25.773707 3805 raft_consensus.cc:491] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
W20250624 02:15:25.773551 3665 leader_election.cc:336] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [CANDIDATE]: Term 3 pre-election: RPC error from VoteRequest() call to peer 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195:42985): Network error: Client connection negotiation failed: client connection to 127.31.42.195:42985: connect: Connection refused (error 111)
I20250624 02:15:25.774046 3805 raft_consensus.cc:3058] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [term 2 FOLLOWER]: Advancing to term 3
I20250624 02:15:25.778105 3805 raft_consensus.cc:513] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [term 3 FOLLOWER]: Starting leader election with config: opid_index: 11 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } attrs { promote: false } }
I20250624 02:15:25.779422 3805 leader_election.cc:290] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [CANDIDATE]: Term 3 election: Requested vote from peers a419c27a5f5b4eaa819ede2e09199dc0 (127.31.42.193:39677), 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195:42985)
I20250624 02:15:25.780638 3587 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "45341658889647099a604223ab78b6db" candidate_uuid: "6431b2b3e46e4e199cfb1609d7c42607" candidate_term: 3 candidate_status { last_received { term: 2 index: 11 } } ignore_live_leader: false dest_uuid: "a419c27a5f5b4eaa819ede2e09199dc0"
I20250624 02:15:25.781122 3587 raft_consensus.cc:3058] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 2 FOLLOWER]: Advancing to term 3
W20250624 02:15:25.785698 3665 leader_election.cc:336] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [CANDIDATE]: Term 3 election: RPC error from VoteRequest() call to peer 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195:42985): Network error: Client connection negotiation failed: client connection to 127.31.42.195:42985: connect: Connection refused (error 111)
I20250624 02:15:25.786063 3587 raft_consensus.cc:2466] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 3 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 6431b2b3e46e4e199cfb1609d7c42607 in term 3.
I20250624 02:15:25.787076 3665 leader_election.cc:304] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [CANDIDATE]: Term 3 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 6431b2b3e46e4e199cfb1609d7c42607, a419c27a5f5b4eaa819ede2e09199dc0; no voters: 5a801f07eaeb46368d7fc28f51b9e1d6
I20250624 02:15:25.787809 3805 raft_consensus.cc:2802] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [term 3 FOLLOWER]: Leader election won for term 3
I20250624 02:15:25.788475 3805 raft_consensus.cc:695] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [term 3 LEADER]: Becoming Leader. State: Replica: 6431b2b3e46e4e199cfb1609d7c42607, State: Running, Role: LEADER
I20250624 02:15:25.789616 3805 consensus_queue.cc:237] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 11, Committed index: 11, Last appended: 2.11, Last appended by leader: 11, Current term: 3, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 11 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } attrs { promote: false } }
I20250624 02:15:25.801081 3443 catalog_manager.cc:5582] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 reported cstate change: term changed from 2 to 3, leader changed from <none> to 6431b2b3e46e4e199cfb1609d7c42607 (127.31.42.194). New cstate: current_term: 3 leader_uuid: "6431b2b3e46e4e199cfb1609d7c42607" committed_config { opid_index: 11 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } attrs { promote: false } health_report { overall_health: UNKNOWN } } }
W20250624 02:15:24.823551 3802 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:15:24.823552 3800 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:15:24.825753 3782 server_base.cc:1048] running on GCE node
I20250624 02:15:26.032347 3782 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:15:26.035187 3782 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:15:26.036649 3782 hybrid_clock.cc:648] HybridClock initialized: now 1750731326036613 us; error 55 us; skew 500 ppm
I20250624 02:15:26.037492 3782 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:15:26.041311 3587 raft_consensus.cc:1273] T 529e58fc77b545c388e07e325ebb525a P a419c27a5f5b4eaa819ede2e09199dc0 [term 3 FOLLOWER]: Refusing update from remote peer 6431b2b3e46e4e199cfb1609d7c42607: Log matching property violated. Preceding OpId in replica: term: 2 index: 12. Preceding OpId from leader: term: 3 index: 13. (index mismatch)
I20250624 02:15:26.042912 3805 consensus_queue.cc:1035] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607 [LEADER]: Connected to new peer: Peer: permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: false }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 13, Last known committed idx: 12, Time since last communication: 0.000s
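The "Log matching property violated ... (index mismatch)" exchange above is the standard Raft consistency check: a follower rejects an append whose preceding OpId it does not already hold, and the leader then backs off and resends from an earlier index (the LMP_MISMATCH / "Next index: 13" line). A simplified sketch of that check is below, under the assumption that the replica's last appended OpId is the one being compared, as in this log; the OpId and Matches names are hypothetical, not raft_consensus.cc types.

```cpp
#include <iostream>

// Hypothetical OpId for this sketch: the (term, index) pair printed in the log.
struct OpId { long term; long index; };

// Simplified follower-side log-matching check: accept an update only if the
// leader's "preceding" OpId is one the replica already holds.
bool Matches(const OpId& replica_last, const OpId& leader_preceding) {
  return replica_last.term == leader_preceding.term &&
         replica_last.index == leader_preceding.index;
}

int main() {
  OpId replica_last{2, 12};      // "Preceding OpId in replica: term: 2 index: 12"
  OpId leader_preceding{3, 13};  // "Preceding OpId from leader: term: 3 index: 13"
  std::cout << (Matches(replica_last, leader_preceding)
                    ? "accept update"
                    : "reject: Log matching property violated (index mismatch)")
            << "\n";
}
```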
I20250624 02:15:26.045081 3782 webserver.cc:469] Webserver started at http://127.31.42.195:39881/ using document root <none> and password file <none>
I20250624 02:15:26.046049 3782 fs_manager.cc:362] Metadata directory not provided
I20250624 02:15:26.046286 3782 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:15:26.058862 3782 fs_manager.cc:714] Time spent opening directory manager: real 0.008s user 0.006s sys 0.000s
W20250624 02:15:26.064965 3665 consensus_peers.cc:487] T 529e58fc77b545c388e07e325ebb525a P 6431b2b3e46e4e199cfb1609d7c42607 -> Peer 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195:42985): Couldn't send request to peer 5a801f07eaeb46368d7fc28f51b9e1d6. Status: Network error: Client connection negotiation failed: client connection to 127.31.42.195:42985: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20250624 02:15:26.068794 3825 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:15:26.070045 3782 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.004s sys 0.000s
I20250624 02:15:26.070339 3782 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
uuid: "5a801f07eaeb46368d7fc28f51b9e1d6"
format_stamp: "Formatted at 2025-06-24 02:14:55 on dist-test-slave-5k9r"
I20250624 02:15:26.072265 3782 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:15:26.129192 3782 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:15:26.131026 3782 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:15:26.131431 3782 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:15:26.133842 3782 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:15:26.139750 3832 ts_tablet_manager.cc:542] Loading tablet metadata (0/2 complete)
I20250624 02:15:26.151504 3782 ts_tablet_manager.cc:579] Loaded tablet metadata (2 total tablets, 2 live tablets)
I20250624 02:15:26.151780 3782 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.013s user 0.002s sys 0.000s
I20250624 02:15:26.152042 3782 ts_tablet_manager.cc:594] Registering tablets (0/2 complete)
I20250624 02:15:26.157450 3832 tablet_bootstrap.cc:492] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6: Bootstrap starting.
I20250624 02:15:26.160200 3782 ts_tablet_manager.cc:610] Registered 2 tablets
I20250624 02:15:26.160398 3782 ts_tablet_manager.cc:589] Time spent register tablets: real 0.008s user 0.007s sys 0.000s
W20250624 02:15:26.186561 3665 consensus_peers.cc:487] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 -> Peer 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195:42985): Couldn't send request to peer 5a801f07eaeb46368d7fc28f51b9e1d6. Status: Network error: Client connection negotiation failed: client connection to 127.31.42.195:42985: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20250624 02:15:26.225829 3832 log.cc:826] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6: Log is configured to *not* fsync() on all Append() calls
I20250624 02:15:26.265998 3587 raft_consensus.cc:1273] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 3 FOLLOWER]: Refusing update from remote peer 6431b2b3e46e4e199cfb1609d7c42607: Log matching property violated. Preceding OpId in replica: term: 2 index: 11. Preceding OpId from leader: term: 3 index: 12. (index mismatch)
I20250624 02:15:26.267546 3805 consensus_queue.cc:1035] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [LEADER]: Connected to new peer: Peer: permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 12, Last known committed idx: 11, Time since last communication: 0.000s
I20250624 02:15:26.315744 3733 consensus_queue.cc:237] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 12, Committed index: 12, Last appended: 3.12, Last appended by leader: 11, Current term: 3, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 13 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } }
I20250624 02:15:26.321127 3587 raft_consensus.cc:1273] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 3 FOLLOWER]: Refusing update from remote peer 6431b2b3e46e4e199cfb1609d7c42607: Log matching property violated. Preceding OpId in replica: term: 3 index: 12. Preceding OpId from leader: term: 3 index: 13. (index mismatch)
I20250624 02:15:26.322788 3810 consensus_queue.cc:1035] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [LEADER]: Connected to new peer: Peer: permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 13, Last known committed idx: 12, Time since last communication: 0.001s
I20250624 02:15:26.329798 3805 raft_consensus.cc:2953] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [term 3 LEADER]: Committing config change with OpId 3.13: config changed from index 11 to 13, VOTER 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195) evicted. New config: { opid_index: 13 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } }
I20250624 02:15:26.331547 3587 raft_consensus.cc:2953] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 3 FOLLOWER]: Committing config change with OpId 3.13: config changed from index 11 to 13, VOTER 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195) evicted. New config: { opid_index: 13 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } }
I20250624 02:15:26.340601 3429 catalog_manager.cc:5095] ChangeConfig:REMOVE_PEER RPC for tablet 45341658889647099a604223ab78b6db with cas_config_opid_index 11: ChangeConfig:REMOVE_PEER succeeded (attempt 1)
I20250624 02:15:26.344933 3444 catalog_manager.cc:5582] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 reported cstate change: config changed from index 11 to 13, VOTER 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195) evicted. New cstate: current_term: 3 leader_uuid: "6431b2b3e46e4e199cfb1609d7c42607" committed_config { opid_index: 13 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } health_report { overall_health: HEALTHY } } }
I20250624 02:15:26.374213 3832 tablet_bootstrap.cc:492] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6: Bootstrap replayed 1/1 log segments. Stats: ops{read=12 overwritten=0 applied=12 ignored=0} inserts{seen=300 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 02:15:26.375414 3832 tablet_bootstrap.cc:492] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6: Bootstrap complete.
I20250624 02:15:26.377707 3832 ts_tablet_manager.cc:1397] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6: Time spent bootstrapping tablet: real 0.221s user 0.177s sys 0.028s
W20250624 02:15:26.383996 3444 catalog_manager.cc:5774] Failed to send DeleteTablet RPC for tablet 45341658889647099a604223ab78b6db on TS 5a801f07eaeb46368d7fc28f51b9e1d6: Not found: failed to reset TS proxy: Could not find TS for UUID 5a801f07eaeb46368d7fc28f51b9e1d6
I20250624 02:15:26.391569 3733 consensus_queue.cc:237] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 13, Committed index: 13, Last appended: 3.13, Last appended by leader: 11, Current term: 3, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: 14 OBSOLETE_local: true peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } }
I20250624 02:15:26.394186 3810 raft_consensus.cc:2953] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 [term 3 LEADER]: Committing config change with OpId 3.14: config changed from index 13 to 14, VOTER a419c27a5f5b4eaa819ede2e09199dc0 (127.31.42.193) evicted. New config: { opid_index: 14 OBSOLETE_local: true peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } }
I20250624 02:15:26.403263 3429 catalog_manager.cc:5095] ChangeConfig:REMOVE_PEER RPC for tablet 45341658889647099a604223ab78b6db with cas_config_opid_index 13: ChangeConfig:REMOVE_PEER succeeded (attempt 1)
I20250624 02:15:26.400614 3832 raft_consensus.cc:357] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 2 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: 12 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: false } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } }
I20250624 02:15:26.404114 3832 raft_consensus.cc:738] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 2 FOLLOWER]: Becoming Follower/Learner. State: Replica: 5a801f07eaeb46368d7fc28f51b9e1d6, State: Initialized, Role: FOLLOWER
I20250624 02:15:26.405211 3832 consensus_queue.cc:260] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 12, Last appended: 2.12, Last appended by leader: 12, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: 12 OBSOLETE_local: true peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } } peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } attrs { promote: false } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } }
I20250624 02:15:26.408640 3443 catalog_manager.cc:5582] T 45341658889647099a604223ab78b6db P 6431b2b3e46e4e199cfb1609d7c42607 reported cstate change: config changed from index 13 to 14, VOTER a419c27a5f5b4eaa819ede2e09199dc0 (127.31.42.193) evicted. New cstate: current_term: 3 leader_uuid: "6431b2b3e46e4e199cfb1609d7c42607" committed_config { opid_index: 14 OBSOLETE_local: true peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } health_report { overall_health: HEALTHY } } }
I20250624 02:15:26.415378 3832 ts_tablet_manager.cc:1428] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6: Time spent starting tablet: real 0.037s user 0.034s sys 0.000s
I20250624 02:15:26.416158 3832 tablet_bootstrap.cc:492] T 45341658889647099a604223ab78b6db P 5a801f07eaeb46368d7fc28f51b9e1d6: Bootstrap starting.
W20250624 02:15:26.441025 3428 catalog_manager.cc:4726] Async tablet task DeleteTablet RPC for tablet 45341658889647099a604223ab78b6db on TS 5a801f07eaeb46368d7fc28f51b9e1d6 failed: Not found: failed to reset TS proxy: Could not find TS for UUID 5a801f07eaeb46368d7fc28f51b9e1d6
I20250624 02:15:26.443133 3567 tablet_service.cc:1515] Processing DeleteTablet for tablet 45341658889647099a604223ab78b6db with delete_type TABLET_DATA_TOMBSTONED (TS a419c27a5f5b4eaa819ede2e09199dc0 not found in new config with opid_index 14) from {username='slave'} at 127.0.0.1:34560
I20250624 02:15:26.447629 3934 tablet_replica.cc:331] stopping tablet replica
I20250624 02:15:26.448676 3934 raft_consensus.cc:2241] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 3 FOLLOWER]: Raft consensus shutting down.
I20250624 02:15:26.449359 3934 raft_consensus.cc:2270] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0 [term 3 FOLLOWER]: Raft consensus is shut down!
I20250624 02:15:26.453647 3934 ts_tablet_manager.cc:1905] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0: Deleting tablet data with delete state TABLET_DATA_TOMBSTONED
I20250624 02:15:26.470118 3934 ts_tablet_manager.cc:1918] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0: tablet deleted with delete type TABLET_DATA_TOMBSTONED: last-logged OpId 3.13
I20250624 02:15:26.470546 3934 log.cc:1199] T 45341658889647099a604223ab78b6db P a419c27a5f5b4eaa819ede2e09199dc0: Deleting WAL directory at /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal/wals/45341658889647099a604223ab78b6db
I20250624 02:15:26.472605 3428 catalog_manager.cc:4928] TS a419c27a5f5b4eaa819ede2e09199dc0 (127.31.42.193:39677): tablet 45341658889647099a604223ab78b6db (table TestTable [id=d6070d47d8534960b0c3b407ada1dcc2]) successfully deleted
I20250624 02:15:26.482403 3782 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.195:42985
I20250624 02:15:26.482520 3951 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.195:42985 every 8 connection(s)
I20250624 02:15:26.485848 3782 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/info.pb
I20250624 02:15:26.496376 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 3782
I20250624 02:15:26.509972 3952 heartbeater.cc:344] Connected to a master server at 127.31.42.254:39985
I20250624 02:15:26.510512 3952 heartbeater.cc:461] Registering TS with master...
I20250624 02:15:26.511843 3952 heartbeater.cc:507] Master 127.31.42.254:39985 requested a full tablet report, sending...
I20250624 02:15:26.516355 3443 ts_manager.cc:194] Registered new tserver with Master: 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195:42985)
I20250624 02:15:26.521063 3443 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.195:32807
I20250624 02:15:26.527750 31915 external_mini_cluster.cc:934] 3 TS(s) registered with all masters
I20250624 02:15:26.532299 31915 external_mini_cluster.cc:934] 3 TS(s) registered with all masters
W20250624 02:15:26.535636 31915 ts_itest-base.cc:209] found only 1 out of 3 replicas of tablet 45341658889647099a604223ab78b6db: tablet_id: "45341658889647099a604223ab78b6db" DEPRECATED_stale: false partition { partition_key_start: "" partition_key_end: "" } interned_replicas { ts_info_idx: 0 role: LEADER }
I20250624 02:15:26.548755 3832 tablet_bootstrap.cc:492] T 45341658889647099a604223ab78b6db P 5a801f07eaeb46368d7fc28f51b9e1d6: Bootstrap replayed 1/1 log segments. Stats: ops{read=11 overwritten=0 applied=11 ignored=0} inserts{seen=250 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 02:15:26.549434 3832 tablet_bootstrap.cc:492] T 45341658889647099a604223ab78b6db P 5a801f07eaeb46368d7fc28f51b9e1d6: Bootstrap complete.
I20250624 02:15:26.550740 3832 ts_tablet_manager.cc:1397] T 45341658889647099a604223ab78b6db P 5a801f07eaeb46368d7fc28f51b9e1d6: Time spent bootstrapping tablet: real 0.135s user 0.114s sys 0.016s
I20250624 02:15:26.552273 3832 raft_consensus.cc:357] T 45341658889647099a604223ab78b6db P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 2 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: 11 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } attrs { promote: false } }
I20250624 02:15:26.552711 3832 raft_consensus.cc:738] T 45341658889647099a604223ab78b6db P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 2 FOLLOWER]: Becoming Follower/Learner. State: Replica: 5a801f07eaeb46368d7fc28f51b9e1d6, State: Initialized, Role: FOLLOWER
I20250624 02:15:26.553174 3832 consensus_queue.cc:260] T 45341658889647099a604223ab78b6db P 5a801f07eaeb46368d7fc28f51b9e1d6 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 11, Last appended: 2.11, Last appended by leader: 11, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: 11 OBSOLETE_local: true peers { permanent_uuid: "a419c27a5f5b4eaa819ede2e09199dc0" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39677 } } peers { permanent_uuid: "6431b2b3e46e4e199cfb1609d7c42607" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 35121 } attrs { promote: false } } peers { permanent_uuid: "5a801f07eaeb46368d7fc28f51b9e1d6" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 42985 } attrs { promote: false } }
I20250624 02:15:26.554164 3952 heartbeater.cc:499] Master 127.31.42.254:39985 was elected leader, sending a full tablet report...
I20250624 02:15:26.554507 3832 ts_tablet_manager.cc:1428] T 45341658889647099a604223ab78b6db P 5a801f07eaeb46368d7fc28f51b9e1d6: Time spent starting tablet: real 0.004s user 0.004s sys 0.000s
I20250624 02:15:26.604648 3898 raft_consensus.cc:3058] T 529e58fc77b545c388e07e325ebb525a P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 2 FOLLOWER]: Advancing to term 3
I20250624 02:15:26.610184 3875 tablet_service.cc:1515] Processing DeleteTablet for tablet 45341658889647099a604223ab78b6db with delete_type TABLET_DATA_TOMBSTONED (TS 5a801f07eaeb46368d7fc28f51b9e1d6 not found in new config with opid_index 13) from {username='slave'} at 127.0.0.1:39468
I20250624 02:15:26.618270 3960 tablet_replica.cc:331] stopping tablet replica
I20250624 02:15:26.619432 3960 raft_consensus.cc:2241] T 45341658889647099a604223ab78b6db P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 2 FOLLOWER]: Raft consensus shutting down.
I20250624 02:15:26.620038 3960 raft_consensus.cc:2270] T 45341658889647099a604223ab78b6db P 5a801f07eaeb46368d7fc28f51b9e1d6 [term 2 FOLLOWER]: Raft consensus is shut down!
I20250624 02:15:26.623349 3960 ts_tablet_manager.cc:1905] T 45341658889647099a604223ab78b6db P 5a801f07eaeb46368d7fc28f51b9e1d6: Deleting tablet data with delete state TABLET_DATA_TOMBSTONED
I20250624 02:15:26.636374 3960 ts_tablet_manager.cc:1918] T 45341658889647099a604223ab78b6db P 5a801f07eaeb46368d7fc28f51b9e1d6: tablet deleted with delete type TABLET_DATA_TOMBSTONED: last-logged OpId 2.11
I20250624 02:15:26.636677 3960 log.cc:1199] T 45341658889647099a604223ab78b6db P 5a801f07eaeb46368d7fc28f51b9e1d6: Deleting WAL directory at /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal/wals/45341658889647099a604223ab78b6db
I20250624 02:15:26.637980 3428 catalog_manager.cc:4928] TS 5a801f07eaeb46368d7fc28f51b9e1d6 (127.31.42.195:42985): tablet 45341658889647099a604223ab78b6db (table TestTable [id=d6070d47d8534960b0c3b407ada1dcc2]) successfully deleted
W20250624 02:15:27.539672 31915 ts_itest-base.cc:209] found only 1 out of 3 replicas of tablet 45341658889647099a604223ab78b6db: tablet_id: "45341658889647099a604223ab78b6db" DEPRECATED_stale: false partition { partition_key_start: "" partition_key_end: "" } interned_replicas { ts_info_idx: 0 role: LEADER }
W20250624 02:15:28.543680 31915 ts_itest-base.cc:209] found only 1 out of 3 replicas of tablet 45341658889647099a604223ab78b6db: tablet_id: "45341658889647099a604223ab78b6db" DEPRECATED_stale: false partition { partition_key_start: "" partition_key_end: "" } interned_replicas { ts_info_idx: 0 role: LEADER }
W20250624 02:15:29.547349 31915 ts_itest-base.cc:209] found only 1 out of 3 replicas of tablet 45341658889647099a604223ab78b6db: tablet_id: "45341658889647099a604223ab78b6db" DEPRECATED_stale: false partition { partition_key_start: "" partition_key_end: "" } interned_replicas { ts_info_idx: 0 role: LEADER }
W20250624 02:15:30.550657 31915 ts_itest-base.cc:209] found only 1 out of 3 replicas of tablet 45341658889647099a604223ab78b6db: tablet_id: "45341658889647099a604223ab78b6db" DEPRECATED_stale: false partition { partition_key_start: "" partition_key_end: "" } interned_replicas { ts_info_idx: 0 role: LEADER }
W20250624 02:15:31.554086 31915 ts_itest-base.cc:209] found only 1 out of 3 replicas of tablet 45341658889647099a604223ab78b6db: tablet_id: "45341658889647099a604223ab78b6db" DEPRECATED_stale: false partition { partition_key_start: "" partition_key_end: "" } interned_replicas { ts_info_idx: 0 role: LEADER }
W20250624 02:15:32.557435 31915 ts_itest-base.cc:209] found only 1 out of 3 replicas of tablet 45341658889647099a604223ab78b6db: tablet_id: "45341658889647099a604223ab78b6db" DEPRECATED_stale: false partition { partition_key_start: "" partition_key_end: "" } interned_replicas { ts_info_idx: 0 role: LEADER }
W20250624 02:15:33.561456 31915 ts_itest-base.cc:209] found only 1 out of 3 replicas of tablet 45341658889647099a604223ab78b6db: tablet_id: "45341658889647099a604223ab78b6db" DEPRECATED_stale: false partition { partition_key_start: "" partition_key_end: "" } interned_replicas { ts_info_idx: 0 role: LEADER }
W20250624 02:15:34.565303 31915 ts_itest-base.cc:209] found only 1 out of 3 replicas of tablet 45341658889647099a604223ab78b6db: tablet_id: "45341658889647099a604223ab78b6db" DEPRECATED_stale: false partition { partition_key_start: "" partition_key_end: "" } interned_replicas { ts_info_idx: 0 role: LEADER }
W20250624 02:15:35.568954 31915 ts_itest-base.cc:209] found only 1 out of 3 replicas of tablet 45341658889647099a604223ab78b6db: tablet_id: "45341658889647099a604223ab78b6db" DEPRECATED_stale: false partition { partition_key_start: "" partition_key_end: "" } interned_replicas { ts_info_idx: 0 role: LEADER }
W20250624 02:15:36.572741 31915 ts_itest-base.cc:209] found only 1 out of 3 replicas of tablet 45341658889647099a604223ab78b6db: tablet_id: "45341658889647099a604223ab78b6db" DEPRECATED_stale: false partition { partition_key_start: "" partition_key_end: "" } interned_replicas { ts_info_idx: 0 role: LEADER }
W20250624 02:15:37.576746 31915 ts_itest-base.cc:209] found only 1 out of 3 replicas of tablet 45341658889647099a604223ab78b6db: tablet_id: "45341658889647099a604223ab78b6db" DEPRECATED_stale: false partition { partition_key_start: "" partition_key_end: "" } interned_replicas { ts_info_idx: 0 role: LEADER }
W20250624 02:15:38.581311 31915 ts_itest-base.cc:209] found only 1 out of 3 replicas of tablet 45341658889647099a604223ab78b6db: tablet_id: "45341658889647099a604223ab78b6db" DEPRECATED_stale: false partition { partition_key_start: "" partition_key_end: "" } interned_replicas { ts_info_idx: 0 role: LEADER }
W20250624 02:15:39.584942 31915 ts_itest-base.cc:209] found only 1 out of 3 replicas of tablet 45341658889647099a604223ab78b6db: tablet_id: "45341658889647099a604223ab78b6db" DEPRECATED_stale: false partition { partition_key_start: "" partition_key_end: "" } interned_replicas { ts_info_idx: 0 role: LEADER }
W20250624 02:15:40.588392 31915 ts_itest-base.cc:209] found only 1 out of 3 replicas of tablet 45341658889647099a604223ab78b6db: tablet_id: "45341658889647099a604223ab78b6db" DEPRECATED_stale: false partition { partition_key_start: "" partition_key_end: "" } interned_replicas { ts_info_idx: 0 role: LEADER }
W20250624 02:15:41.592319 31915 ts_itest-base.cc:209] found only 1 out of 3 replicas of tablet 45341658889647099a604223ab78b6db: tablet_id: "45341658889647099a604223ab78b6db" DEPRECATED_stale: false partition { partition_key_start: "" partition_key_end: "" } interned_replicas { ts_info_idx: 0 role: LEADER }
W20250624 02:15:42.595952 31915 ts_itest-base.cc:209] found only 1 out of 3 replicas of tablet 45341658889647099a604223ab78b6db: tablet_id: "45341658889647099a604223ab78b6db" DEPRECATED_stale: false partition { partition_key_start: "" partition_key_end: "" } interned_replicas { ts_info_idx: 0 role: LEADER }
W20250624 02:15:43.599642 31915 ts_itest-base.cc:209] found only 1 out of 3 replicas of tablet 45341658889647099a604223ab78b6db: tablet_id: "45341658889647099a604223ab78b6db" DEPRECATED_stale: false partition { partition_key_start: "" partition_key_end: "" } interned_replicas { ts_info_idx: 0 role: LEADER }
W20250624 02:15:44.603348 31915 ts_itest-base.cc:209] found only 1 out of 3 replicas of tablet 45341658889647099a604223ab78b6db: tablet_id: "45341658889647099a604223ab78b6db" DEPRECATED_stale: false partition { partition_key_start: "" partition_key_end: "" } interned_replicas { ts_info_idx: 0 role: LEADER }
W20250624 02:15:45.606536 31915 ts_itest-base.cc:209] found only 1 out of 3 replicas of tablet 45341658889647099a604223ab78b6db: tablet_id: "45341658889647099a604223ab78b6db" DEPRECATED_stale: false partition { partition_key_start: "" partition_key_end: "" } interned_replicas { ts_info_idx: 0 role: LEADER }
/home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/tools/kudu-admin-test.cc:3914: Failure
Failed
Bad status: Not found: not all replicas of tablets comprising table TestTable are registered yet
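The twenty once-per-second warnings above, followed by the "Bad status: Not found" failure at kudu-admin-test.cc:3914, are the classic poll-until-deadline pattern: retry a check on a fixed interval and surface the last status once the deadline expires. A self-contained sketch of that pattern is shown below; the WaitFor helper and its parameters are hypothetical stand-ins, not the ts_itest-base.cc API.

```cpp
#include <chrono>
#include <functional>
#include <iostream>
#include <thread>

// Hypothetical helper: poll `check` once per `interval` until it succeeds or
// `deadline` elapses; mirrors the ~1s cadence of the warnings in the log.
bool WaitFor(const std::function<bool()>& check,
             std::chrono::seconds deadline,
             std::chrono::seconds interval = std::chrono::seconds(1)) {
  auto end = std::chrono::steady_clock::now() + deadline;
  while (std::chrono::steady_clock::now() < end) {
    if (check()) return true;
    std::this_thread::sleep_for(interval);
  }
  return false;
}

int main() {
  int replicas_seen = 1;  // stand-in for "found only 1 out of 3 replicas"
  bool ok = WaitFor([&] { return replicas_seen == 3; }, std::chrono::seconds(3));
  std::cout << (ok ? "all replicas registered"
                   : "Bad status: Not found (replicas never registered)") << "\n";
}
```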
I20250624 02:15:46.609396 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 3481
I20250624 02:15:46.636925 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 3633
I20250624 02:15:46.666395 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 3782
I20250624 02:15:46.695008 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 3410
2025-06-24T02:15:46Z chronyd exiting
I20250624 02:15:46.744902 31915 test_util.cc:183] -----------------------------------------------
I20250624 02:15:46.745115 31915 test_util.cc:184] Had failures, leaving test files at /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.AdminCliTest.TestRebuildTables.1750731222455155-31915-0
[ FAILED ] AdminCliTest.TestRebuildTables (58547 ms)
[----------] 5 tests from AdminCliTest (124220 ms total)
[----------] 1 test from EnableKudu1097AndDownTS/MoveTabletParamTest
[ RUN ] EnableKudu1097AndDownTS/MoveTabletParamTest.Test/4
I20250624 02:15:46.748987 31915 test_util.cc:276] Using random seed: -402315740
I20250624 02:15:46.753458 31915 ts_itest-base.cc:115] Starting cluster with:
I20250624 02:15:46.753625 31915 ts_itest-base.cc:116] --------------
I20250624 02:15:46.753737 31915 ts_itest-base.cc:117] 5 tablet servers
I20250624 02:15:46.753840 31915 ts_itest-base.cc:118] 3 replicas per TS
I20250624 02:15:46.753963 31915 ts_itest-base.cc:119] --------------
2025-06-24T02:15:46Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2025-06-24T02:15:46Z Disabled control of system clock
I20250624 02:15:46.801774 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.31.42.254:46799
--webserver_interface=127.31.42.254
--webserver_port=0
--builtin_ntp_servers=127.31.42.212:43433
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.31.42.254:46799
--raft_prepare_replacement_before_eviction=true with env {}
W20250624 02:15:47.108980 3982 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:15:47.109566 3982 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:15:47.110054 3982 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:15:47.141216 3982 flags.cc:425] Enabled experimental flag: --raft_prepare_replacement_before_eviction=true
W20250624 02:15:47.141636 3982 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20250624 02:15:47.141928 3982 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:15:47.142210 3982 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20250624 02:15:47.142453 3982 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20250624 02:15:47.178177 3982 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:43433
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.31.42.254:46799
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.31.42.254:46799
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb
--webserver_interface=127.31.42.254
--webserver_port=0
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/logs
--logbuflevel=-1
--logtostderr=true
Master server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:15:47.179488 3982 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:15:47.181156 3982 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:15:47.196889 3988 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:15:47.196923 3989 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:15:47.199505 3982 server_base.cc:1048] running on GCE node
W20250624 02:15:47.197216 3991 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:15:48.382625 3982 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:15:48.385793 3982 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:15:48.387315 3982 hybrid_clock.cc:648] HybridClock initialized: now 1750731348387254 us; error 81 us; skew 500 ppm
I20250624 02:15:48.388157 3982 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:15:48.394997 3982 webserver.cc:469] Webserver started at http://127.31.42.254:39033/ using document root <none> and password file <none>
I20250624 02:15:48.395915 3982 fs_manager.cc:362] Metadata directory not provided
I20250624 02:15:48.396116 3982 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:15:48.396584 3982 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:15:48.401065 3982 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/instance:
uuid: "b5a500ff6a8843d8b9af5f6191eff7ce"
format_stamp: "Formatted at 2025-06-24 02:15:48 on dist-test-slave-5k9r"
I20250624 02:15:48.402209 3982 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal/instance:
uuid: "b5a500ff6a8843d8b9af5f6191eff7ce"
format_stamp: "Formatted at 2025-06-24 02:15:48 on dist-test-slave-5k9r"
I20250624 02:15:48.409271 3982 fs_manager.cc:696] Time spent creating directory manager: real 0.006s user 0.006s sys 0.001s
I20250624 02:15:48.414856 3998 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:15:48.415941 3982 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.004s sys 0.000s
I20250624 02:15:48.416280 3982 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
uuid: "b5a500ff6a8843d8b9af5f6191eff7ce"
format_stamp: "Formatted at 2025-06-24 02:15:48 on dist-test-slave-5k9r"
I20250624 02:15:48.416623 3982 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:15:48.476694 3982 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:15:48.478178 3982 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:15:48.478597 3982 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:15:48.547927 3982 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.254:46799
I20250624 02:15:48.548020 4049 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.254:46799 every 8 connection(s)
I20250624 02:15:48.550581 3982 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb
I20250624 02:15:48.555693 4050 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:15:48.556061 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 3982
I20250624 02:15:48.556463 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal/instance
I20250624 02:15:48.581590 4050 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P b5a500ff6a8843d8b9af5f6191eff7ce: Bootstrap starting.
I20250624 02:15:48.586818 4050 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P b5a500ff6a8843d8b9af5f6191eff7ce: Neither blocks nor log segments found. Creating new log.
I20250624 02:15:48.588559 4050 log.cc:826] T 00000000000000000000000000000000 P b5a500ff6a8843d8b9af5f6191eff7ce: Log is configured to *not* fsync() on all Append() calls
I20250624 02:15:48.593060 4050 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P b5a500ff6a8843d8b9af5f6191eff7ce: No bootstrap required, opened a new log
I20250624 02:15:48.610538 4050 raft_consensus.cc:357] T 00000000000000000000000000000000 P b5a500ff6a8843d8b9af5f6191eff7ce [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "b5a500ff6a8843d8b9af5f6191eff7ce" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 46799 } }
I20250624 02:15:48.611248 4050 raft_consensus.cc:383] T 00000000000000000000000000000000 P b5a500ff6a8843d8b9af5f6191eff7ce [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:15:48.611483 4050 raft_consensus.cc:738] T 00000000000000000000000000000000 P b5a500ff6a8843d8b9af5f6191eff7ce [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: b5a500ff6a8843d8b9af5f6191eff7ce, State: Initialized, Role: FOLLOWER
I20250624 02:15:48.612150 4050 consensus_queue.cc:260] T 00000000000000000000000000000000 P b5a500ff6a8843d8b9af5f6191eff7ce [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "b5a500ff6a8843d8b9af5f6191eff7ce" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 46799 } }
I20250624 02:15:48.612645 4050 raft_consensus.cc:397] T 00000000000000000000000000000000 P b5a500ff6a8843d8b9af5f6191eff7ce [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 02:15:48.612912 4050 raft_consensus.cc:491] T 00000000000000000000000000000000 P b5a500ff6a8843d8b9af5f6191eff7ce [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 02:15:48.613217 4050 raft_consensus.cc:3058] T 00000000000000000000000000000000 P b5a500ff6a8843d8b9af5f6191eff7ce [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:15:48.617779 4050 raft_consensus.cc:513] T 00000000000000000000000000000000 P b5a500ff6a8843d8b9af5f6191eff7ce [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "b5a500ff6a8843d8b9af5f6191eff7ce" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 46799 } }
I20250624 02:15:48.618505 4050 leader_election.cc:304] T 00000000000000000000000000000000 P b5a500ff6a8843d8b9af5f6191eff7ce [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: b5a500ff6a8843d8b9af5f6191eff7ce; no voters:
I20250624 02:15:48.620189 4050 leader_election.cc:290] T 00000000000000000000000000000000 P b5a500ff6a8843d8b9af5f6191eff7ce [CANDIDATE]: Term 1 election: Requested vote from peers
I20250624 02:15:48.620889 4055 raft_consensus.cc:2802] T 00000000000000000000000000000000 P b5a500ff6a8843d8b9af5f6191eff7ce [term 1 FOLLOWER]: Leader election won for term 1
I20250624 02:15:48.623127 4055 raft_consensus.cc:695] T 00000000000000000000000000000000 P b5a500ff6a8843d8b9af5f6191eff7ce [term 1 LEADER]: Becoming Leader. State: Replica: b5a500ff6a8843d8b9af5f6191eff7ce, State: Running, Role: LEADER
I20250624 02:15:48.623993 4055 consensus_queue.cc:237] T 00000000000000000000000000000000 P b5a500ff6a8843d8b9af5f6191eff7ce [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "b5a500ff6a8843d8b9af5f6191eff7ce" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 46799 } }
I20250624 02:15:48.624425 4050 sys_catalog.cc:564] T 00000000000000000000000000000000 P b5a500ff6a8843d8b9af5f6191eff7ce [sys.catalog]: configured and running, proceeding with master startup.
I20250624 02:15:48.635310 4057 sys_catalog.cc:455] T 00000000000000000000000000000000 P b5a500ff6a8843d8b9af5f6191eff7ce [sys.catalog]: SysCatalogTable state changed. Reason: New leader b5a500ff6a8843d8b9af5f6191eff7ce. Latest consensus state: current_term: 1 leader_uuid: "b5a500ff6a8843d8b9af5f6191eff7ce" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "b5a500ff6a8843d8b9af5f6191eff7ce" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 46799 } } }
I20250624 02:15:48.636828 4056 sys_catalog.cc:455] T 00000000000000000000000000000000 P b5a500ff6a8843d8b9af5f6191eff7ce [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "b5a500ff6a8843d8b9af5f6191eff7ce" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "b5a500ff6a8843d8b9af5f6191eff7ce" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 46799 } } }
I20250624 02:15:48.637465 4057 sys_catalog.cc:458] T 00000000000000000000000000000000 P b5a500ff6a8843d8b9af5f6191eff7ce [sys.catalog]: This master's current role is: LEADER
I20250624 02:15:48.637549 4056 sys_catalog.cc:458] T 00000000000000000000000000000000 P b5a500ff6a8843d8b9af5f6191eff7ce [sys.catalog]: This master's current role is: LEADER
I20250624 02:15:48.643306 4064 catalog_manager.cc:1477] Loading table and tablet metadata into memory...
I20250624 02:15:48.653616 4064 catalog_manager.cc:1486] Initializing Kudu cluster ID...
I20250624 02:15:48.668452 4064 catalog_manager.cc:1349] Generated new cluster ID: c74d9237d20246b2830a85d76e6980f1
I20250624 02:15:48.668766 4064 catalog_manager.cc:1497] Initializing Kudu internal certificate authority...
I20250624 02:15:48.693076 4064 catalog_manager.cc:1372] Generated new certificate authority record
I20250624 02:15:48.694932 4064 catalog_manager.cc:1506] Loading token signing keys...
I20250624 02:15:48.714170 4064 catalog_manager.cc:5955] T 00000000000000000000000000000000 P b5a500ff6a8843d8b9af5f6191eff7ce: Generated new TSK 0
I20250624 02:15:48.715220 4064 catalog_manager.cc:1516] Initializing in-progress tserver states...
I20250624 02:15:48.738065 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.193:0
--local_ip_for_outbound_sockets=127.31.42.193
--webserver_interface=127.31.42.193
--webserver_port=0
--tserver_master_addrs=127.31.42.254:46799
--builtin_ntp_servers=127.31.42.212:43433
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_prepare_replacement_before_eviction=true with env {}
W20250624 02:15:49.047622 4074 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:15:49.048141 4074 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:15:49.048640 4074 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:15:49.080626 4074 flags.cc:425] Enabled experimental flag: --raft_prepare_replacement_before_eviction=true
W20250624 02:15:49.081024 4074 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:15:49.081775 4074 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.193
I20250624 02:15:49.116293 4074 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:43433
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.193:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/info.pb
--webserver_interface=127.31.42.193
--webserver_port=0
--tserver_master_addrs=127.31.42.254:46799
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.193
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:15:49.117585 4074 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:15:49.119215 4074 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:15:49.136590 4081 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:15:49.138000 4080 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:15:49.138511 4074 server_base.cc:1048] running on GCE node
W20250624 02:15:49.138180 4083 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:15:50.315573 4074 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:15:50.318344 4074 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:15:50.319792 4074 hybrid_clock.cc:648] HybridClock initialized: now 1750731350319717 us; error 86 us; skew 500 ppm
I20250624 02:15:50.320858 4074 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:15:50.328608 4074 webserver.cc:469] Webserver started at http://127.31.42.193:41611/ using document root <none> and password file <none>
I20250624 02:15:50.329955 4074 fs_manager.cc:362] Metadata directory not provided
I20250624 02:15:50.330225 4074 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:15:50.330801 4074 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:15:50.337344 4074 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/instance:
uuid: "45c477fce2ee43db880b2b211efe398c"
format_stamp: "Formatted at 2025-06-24 02:15:50 on dist-test-slave-5k9r"
I20250624 02:15:50.338788 4074 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal/instance:
uuid: "45c477fce2ee43db880b2b211efe398c"
format_stamp: "Formatted at 2025-06-24 02:15:50 on dist-test-slave-5k9r"
I20250624 02:15:50.347965 4074 fs_manager.cc:696] Time spent creating directory manager: real 0.008s user 0.004s sys 0.004s
I20250624 02:15:50.355151 4090 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:15:50.356281 4074 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.001s sys 0.003s
I20250624 02:15:50.356671 4074 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
uuid: "45c477fce2ee43db880b2b211efe398c"
format_stamp: "Formatted at 2025-06-24 02:15:50 on dist-test-slave-5k9r"
I20250624 02:15:50.357107 4074 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:15:50.413643 4074 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:15:50.415153 4074 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:15:50.415585 4074 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:15:50.418298 4074 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:15:50.422657 4074 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 02:15:50.422883 4074 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250624 02:15:50.423137 4074 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 02:15:50.423295 4074 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250624 02:15:50.563271 4074 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.193:46613
I20250624 02:15:50.563380 4202 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.193:46613 every 8 connection(s)
I20250624 02:15:50.565778 4074 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/info.pb
I20250624 02:15:50.574426 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 4074
I20250624 02:15:50.574815 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal/instance
I20250624 02:15:50.580605 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.194:0
--local_ip_for_outbound_sockets=127.31.42.194
--webserver_interface=127.31.42.194
--webserver_port=0
--tserver_master_addrs=127.31.42.254:46799
--builtin_ntp_servers=127.31.42.212:43433
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_prepare_replacement_before_eviction=true with env {}
I20250624 02:15:50.590623 4203 heartbeater.cc:344] Connected to a master server at 127.31.42.254:46799
I20250624 02:15:50.591058 4203 heartbeater.cc:461] Registering TS with master...
I20250624 02:15:50.592042 4203 heartbeater.cc:507] Master 127.31.42.254:46799 requested a full tablet report, sending...
I20250624 02:15:50.594475 4015 ts_manager.cc:194] Registered new tserver with Master: 45c477fce2ee43db880b2b211efe398c (127.31.42.193:46613)
I20250624 02:15:50.596330 4015 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.193:48681
W20250624 02:15:50.879058 4207 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:15:50.879577 4207 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:15:50.880096 4207 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:15:50.910935 4207 flags.cc:425] Enabled experimental flag: --raft_prepare_replacement_before_eviction=true
W20250624 02:15:50.911360 4207 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:15:50.912147 4207 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.194
I20250624 02:15:50.947558 4207 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:43433
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.194:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/info.pb
--webserver_interface=127.31.42.194
--webserver_port=0
--tserver_master_addrs=127.31.42.254:46799
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.194
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:15:50.948843 4207 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:15:50.950608 4207 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:15:50.967414 4214 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:15:51.599740 4203 heartbeater.cc:499] Master 127.31.42.254:46799 was elected leader, sending a full tablet report...
W20250624 02:15:50.967443 4216 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:15:50.969607 4207 server_base.cc:1048] running on GCE node
W20250624 02:15:50.968827 4213 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:15:52.135069 4207 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:15:52.137331 4207 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:15:52.138691 4207 hybrid_clock.cc:648] HybridClock initialized: now 1750731352138658 us; error 41 us; skew 500 ppm
I20250624 02:15:52.139503 4207 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:15:52.146495 4207 webserver.cc:469] Webserver started at http://127.31.42.194:37653/ using document root <none> and password file <none>
I20250624 02:15:52.147536 4207 fs_manager.cc:362] Metadata directory not provided
I20250624 02:15:52.147728 4207 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:15:52.148154 4207 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:15:52.152719 4207 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/instance:
uuid: "245cfd7eb220453dbf52ac90fb0b06a5"
format_stamp: "Formatted at 2025-06-24 02:15:52 on dist-test-slave-5k9r"
I20250624 02:15:52.153786 4207 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal/instance:
uuid: "245cfd7eb220453dbf52ac90fb0b06a5"
format_stamp: "Formatted at 2025-06-24 02:15:52 on dist-test-slave-5k9r"
I20250624 02:15:52.160827 4207 fs_manager.cc:696] Time spent creating directory manager: real 0.006s user 0.001s sys 0.004s
I20250624 02:15:52.166697 4223 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:15:52.167726 4207 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.002s sys 0.002s
I20250624 02:15:52.168057 4207 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
uuid: "245cfd7eb220453dbf52ac90fb0b06a5"
format_stamp: "Formatted at 2025-06-24 02:15:52 on dist-test-slave-5k9r"
I20250624 02:15:52.168426 4207 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:15:52.219499 4207 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:15:52.220964 4207 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:15:52.221416 4207 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:15:52.223943 4207 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:15:52.228058 4207 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 02:15:52.228271 4207 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250624 02:15:52.228533 4207 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 02:15:52.228694 4207 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250624 02:15:52.361223 4207 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.194:41411
I20250624 02:15:52.361321 4335 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.194:41411 every 8 connection(s)
I20250624 02:15:52.363763 4207 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/data/info.pb
I20250624 02:15:52.367365 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 4207
I20250624 02:15:52.367801 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-1/wal/instance
I20250624 02:15:52.374775 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.195:0
--local_ip_for_outbound_sockets=127.31.42.195
--webserver_interface=127.31.42.195
--webserver_port=0
--tserver_master_addrs=127.31.42.254:46799
--builtin_ntp_servers=127.31.42.212:43433
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_prepare_replacement_before_eviction=true with env {}
I20250624 02:15:52.386132 4336 heartbeater.cc:344] Connected to a master server at 127.31.42.254:46799
I20250624 02:15:52.386554 4336 heartbeater.cc:461] Registering TS with master...
I20250624 02:15:52.387585 4336 heartbeater.cc:507] Master 127.31.42.254:46799 requested a full tablet report, sending...
I20250624 02:15:52.389928 4015 ts_manager.cc:194] Registered new tserver with Master: 245cfd7eb220453dbf52ac90fb0b06a5 (127.31.42.194:41411)
I20250624 02:15:52.391847 4015 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.194:46993
W20250624 02:15:52.682065 4340 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:15:52.682535 4340 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:15:52.682974 4340 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:15:52.714186 4340 flags.cc:425] Enabled experimental flag: --raft_prepare_replacement_before_eviction=true
W20250624 02:15:52.714548 4340 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:15:52.715282 4340 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.195
I20250624 02:15:52.750142 4340 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:43433
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.195:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/info.pb
--webserver_interface=127.31.42.195
--webserver_port=0
--tserver_master_addrs=127.31.42.254:46799
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.195
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:15:52.751328 4340 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:15:52.752924 4340 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:15:52.774519 4347 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:15:53.395392 4336 heartbeater.cc:499] Master 127.31.42.254:46799 was elected leader, sending a full tablet report...
W20250624 02:15:52.775539 4349 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:15:52.775579 4346 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:15:53.947470 4348 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Connection time-out
I20250624 02:15:53.947542 4340 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
I20250624 02:15:53.952172 4340 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:15:53.954946 4340 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:15:53.956410 4340 hybrid_clock.cc:648] HybridClock initialized: now 1750731353956368 us; error 57 us; skew 500 ppm
I20250624 02:15:53.957247 4340 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:15:53.963394 4340 webserver.cc:469] Webserver started at http://127.31.42.195:40845/ using document root <none> and password file <none>
I20250624 02:15:53.964318 4340 fs_manager.cc:362] Metadata directory not provided
I20250624 02:15:53.964524 4340 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:15:53.964985 4340 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:15:53.969455 4340 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/instance:
uuid: "8ca056b4984e412582bb696f843d5964"
format_stamp: "Formatted at 2025-06-24 02:15:53 on dist-test-slave-5k9r"
I20250624 02:15:53.970597 4340 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal/instance:
uuid: "8ca056b4984e412582bb696f843d5964"
format_stamp: "Formatted at 2025-06-24 02:15:53 on dist-test-slave-5k9r"
I20250624 02:15:53.977651 4340 fs_manager.cc:696] Time spent creating directory manager: real 0.006s user 0.006s sys 0.001s
I20250624 02:15:53.983466 4357 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:15:53.984548 4340 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.005s sys 0.000s
I20250624 02:15:53.984897 4340 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
uuid: "8ca056b4984e412582bb696f843d5964"
format_stamp: "Formatted at 2025-06-24 02:15:53 on dist-test-slave-5k9r"
I20250624 02:15:53.985231 4340 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:15:54.043283 4340 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:15:54.044852 4340 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:15:54.045324 4340 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:15:54.047905 4340 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:15:54.052260 4340 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 02:15:54.052456 4340 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250624 02:15:54.052737 4340 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 02:15:54.052891 4340 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250624 02:15:54.188638 4340 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.195:44709
I20250624 02:15:54.188740 4469 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.195:44709 every 8 connection(s)
I20250624 02:15:54.191195 4340 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/data/info.pb
I20250624 02:15:54.195580 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 4340
I20250624 02:15:54.196491 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-2/wal/instance
I20250624 02:15:54.206346 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.196:0
--local_ip_for_outbound_sockets=127.31.42.196
--webserver_interface=127.31.42.196
--webserver_port=0
--tserver_master_addrs=127.31.42.254:46799
--builtin_ntp_servers=127.31.42.212:43433
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_prepare_replacement_before_eviction=true with env {}
I20250624 02:15:54.218057 4470 heartbeater.cc:344] Connected to a master server at 127.31.42.254:46799
I20250624 02:15:54.218596 4470 heartbeater.cc:461] Registering TS with master...
I20250624 02:15:54.219985 4470 heartbeater.cc:507] Master 127.31.42.254:46799 requested a full tablet report, sending...
I20250624 02:15:54.222473 4015 ts_manager.cc:194] Registered new tserver with Master: 8ca056b4984e412582bb696f843d5964 (127.31.42.195:44709)
I20250624 02:15:54.223683 4015 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.195:56579
W20250624 02:15:54.515193 4474 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:15:54.515712 4474 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:15:54.516216 4474 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:15:54.551499 4474 flags.cc:425] Enabled experimental flag: --raft_prepare_replacement_before_eviction=true
W20250624 02:15:54.552053 4474 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:15:54.553300 4474 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.196
I20250624 02:15:54.598217 4474 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:43433
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.196:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/data/info.pb
--webserver_interface=127.31.42.196
--webserver_port=0
--tserver_master_addrs=127.31.42.254:46799
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.196
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:15:54.599423 4474 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:15:54.601060 4474 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:15:54.617341 4481 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:15:55.227358 4470 heartbeater.cc:499] Master 127.31.42.254:46799 was elected leader, sending a full tablet report...
W20250624 02:15:54.617348 4480 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:15:54.617396 4483 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:15:55.805014 4482 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Connection time-out
I20250624 02:15:55.805127 4474 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
I20250624 02:15:55.809305 4474 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:15:55.812041 4474 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:15:55.813463 4474 hybrid_clock.cc:648] HybridClock initialized: now 1750731355813409 us; error 71 us; skew 500 ppm
I20250624 02:15:55.814275 4474 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:15:55.827975 4474 webserver.cc:469] Webserver started at http://127.31.42.196:35879/ using document root <none> and password file <none>
I20250624 02:15:55.828892 4474 fs_manager.cc:362] Metadata directory not provided
I20250624 02:15:55.829090 4474 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:15:55.829488 4474 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:15:55.834002 4474 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/data/instance:
uuid: "099478c88e8a42f28b19b3fe6acbdd67"
format_stamp: "Formatted at 2025-06-24 02:15:55 on dist-test-slave-5k9r"
I20250624 02:15:55.835094 4474 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/wal/instance:
uuid: "099478c88e8a42f28b19b3fe6acbdd67"
format_stamp: "Formatted at 2025-06-24 02:15:55 on dist-test-slave-5k9r"
I20250624 02:15:55.842222 4474 fs_manager.cc:696] Time spent creating directory manager: real 0.007s user 0.006s sys 0.001s
I20250624 02:15:55.847939 4490 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:15:55.849134 4474 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.005s sys 0.000s
I20250624 02:15:55.849458 4474 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/wal
uuid: "099478c88e8a42f28b19b3fe6acbdd67"
format_stamp: "Formatted at 2025-06-24 02:15:55 on dist-test-slave-5k9r"
I20250624 02:15:55.849777 4474 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:15:55.897339 4474 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:15:55.898813 4474 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:15:55.899281 4474 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:15:55.901918 4474 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:15:55.906090 4474 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 02:15:55.906289 4474 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250624 02:15:55.906486 4474 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 02:15:55.906620 4474 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250624 02:15:56.042258 4474 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.196:38665
I20250624 02:15:56.042382 4602 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.196:38665 every 8 connection(s)
I20250624 02:15:56.044790 4474 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/data/info.pb
I20250624 02:15:56.045418 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 4474
I20250624 02:15:56.045965 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-3/wal/instance
I20250624 02:15:56.054522 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-4/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-4/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-4/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-4/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.197:0
--local_ip_for_outbound_sockets=127.31.42.197
--webserver_interface=127.31.42.197
--webserver_port=0
--tserver_master_addrs=127.31.42.254:46799
--builtin_ntp_servers=127.31.42.212:43433
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--raft_prepare_replacement_before_eviction=true with env {}
I20250624 02:15:56.070804 4603 heartbeater.cc:344] Connected to a master server at 127.31.42.254:46799
I20250624 02:15:56.071239 4603 heartbeater.cc:461] Registering TS with master...
I20250624 02:15:56.072228 4603 heartbeater.cc:507] Master 127.31.42.254:46799 requested a full tablet report, sending...
I20250624 02:15:56.074498 4015 ts_manager.cc:194] Registered new tserver with Master: 099478c88e8a42f28b19b3fe6acbdd67 (127.31.42.196:38665)
I20250624 02:15:56.076300 4015 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.196:34173
W20250624 02:15:56.376883 4607 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:15:56.377342 4607 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:15:56.377777 4607 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:15:56.409752 4607 flags.cc:425] Enabled experimental flag: --raft_prepare_replacement_before_eviction=true
W20250624 02:15:56.410163 4607 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:15:56.410920 4607 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.197
I20250624 02:15:56.445298 4607 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:43433
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-4/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-4/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.197:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-4/data/info.pb
--webserver_interface=127.31.42.197
--webserver_port=0
--tserver_master_addrs=127.31.42.254:46799
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.197
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-4/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:15:56.446512 4607 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:15:56.448025 4607 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:15:56.463034 4613 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:15:57.079596 4603 heartbeater.cc:499] Master 127.31.42.254:46799 was elected leader, sending a full tablet report...
W20250624 02:15:56.463142 4614 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:15:56.464710 4616 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:15:57.642540 4615 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Connection time-out
I20250624 02:15:57.642618 4607 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
I20250624 02:15:57.646266 4607 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:15:57.648451 4607 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:15:57.649837 4607 hybrid_clock.cc:648] HybridClock initialized: now 1750731357649778 us; error 74 us; skew 500 ppm
I20250624 02:15:57.650748 4607 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:15:57.657403 4607 webserver.cc:469] Webserver started at http://127.31.42.197:37107/ using document root <none> and password file <none>
I20250624 02:15:57.658403 4607 fs_manager.cc:362] Metadata directory not provided
I20250624 02:15:57.658600 4607 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:15:57.659054 4607 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:15:57.663621 4607 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-4/data/instance:
uuid: "cfc380cfea2a405fbc345ab5adfaf441"
format_stamp: "Formatted at 2025-06-24 02:15:57 on dist-test-slave-5k9r"
I20250624 02:15:57.664664 4607 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-4/wal/instance:
uuid: "cfc380cfea2a405fbc345ab5adfaf441"
format_stamp: "Formatted at 2025-06-24 02:15:57 on dist-test-slave-5k9r"
I20250624 02:15:57.671674 4607 fs_manager.cc:696] Time spent creating directory manager: real 0.006s user 0.008s sys 0.000s
I20250624 02:15:57.677037 4624 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:15:57.678089 4607 fs_manager.cc:730] Time spent opening block manager: real 0.003s user 0.003s sys 0.000s
I20250624 02:15:57.678381 4607 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-4/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-4/wal
uuid: "cfc380cfea2a405fbc345ab5adfaf441"
format_stamp: "Formatted at 2025-06-24 02:15:57 on dist-test-slave-5k9r"
I20250624 02:15:57.678681 4607 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-4/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-4/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-4/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:15:57.741894 4607 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:15:57.743484 4607 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:15:57.743914 4607 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:15:57.746407 4607 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:15:57.750468 4607 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 02:15:57.750756 4607 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.001s
I20250624 02:15:57.751050 4607 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 02:15:57.751225 4607 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250624 02:15:57.894960 4607 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.197:42029
I20250624 02:15:57.895037 4736 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.197:42029 every 8 connection(s)
I20250624 02:15:57.898821 4607 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-4/data/info.pb
I20250624 02:15:57.904587 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 4607
I20250624 02:15:57.905169 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.EnableKudu1097AndDownTS_MoveTabletParamTest.Test_4.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-4/wal/instance
I20250624 02:15:57.920095 4737 heartbeater.cc:344] Connected to a master server at 127.31.42.254:46799
I20250624 02:15:57.920499 4737 heartbeater.cc:461] Registering TS with master...
I20250624 02:15:57.921442 4737 heartbeater.cc:507] Master 127.31.42.254:46799 requested a full tablet report, sending...
I20250624 02:15:57.923705 4015 ts_manager.cc:194] Registered new tserver with Master: cfc380cfea2a405fbc345ab5adfaf441 (127.31.42.197:42029)
I20250624 02:15:57.925197 4015 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.197:60375
I20250624 02:15:57.926236 31915 external_mini_cluster.cc:934] 5 TS(s) registered with all masters
I20250624 02:15:57.964226 4015 catalog_manager.cc:2232] Servicing CreateTable request from {username='slave'} at 127.0.0.1:51790:
name: "TestTable"
schema {
columns {
name: "key"
type: INT32
is_key: true
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "int_val"
type: INT32
is_key: false
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "string_val"
type: STRING
is_key: false
is_nullable: true
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
range_schema {
columns {
name: "key"
}
}
}
owner: "alice"
I20250624 02:15:58.047902 4672 tablet_service.cc:1468] Processing CreateTablet for tablet 40d336f7b7f94a79943ced81ab6e5a5b (DEFAULT_TABLE table=TestTable [id=e5c74fd14a4545c99331ca7961bbc8f6]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 02:15:58.050096 4672 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 40d336f7b7f94a79943ced81ab6e5a5b. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:15:58.050479 4138 tablet_service.cc:1468] Processing CreateTablet for tablet 40d336f7b7f94a79943ced81ab6e5a5b (DEFAULT_TABLE table=TestTable [id=e5c74fd14a4545c99331ca7961bbc8f6]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 02:15:58.052455 4138 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 40d336f7b7f94a79943ced81ab6e5a5b. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:15:58.062062 4538 tablet_service.cc:1468] Processing CreateTablet for tablet 40d336f7b7f94a79943ced81ab6e5a5b (DEFAULT_TABLE table=TestTable [id=e5c74fd14a4545c99331ca7961bbc8f6]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 02:15:58.064153 4538 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 40d336f7b7f94a79943ced81ab6e5a5b. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:15:58.086483 4756 tablet_bootstrap.cc:492] T 40d336f7b7f94a79943ced81ab6e5a5b P cfc380cfea2a405fbc345ab5adfaf441: Bootstrap starting.
I20250624 02:15:58.089115 4757 tablet_bootstrap.cc:492] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c: Bootstrap starting.
I20250624 02:15:58.093230 4756 tablet_bootstrap.cc:654] T 40d336f7b7f94a79943ced81ab6e5a5b P cfc380cfea2a405fbc345ab5adfaf441: Neither blocks nor log segments found. Creating new log.
I20250624 02:15:58.094992 4756 log.cc:826] T 40d336f7b7f94a79943ced81ab6e5a5b P cfc380cfea2a405fbc345ab5adfaf441: Log is configured to *not* fsync() on all Append() calls
I20250624 02:15:58.096649 4757 tablet_bootstrap.cc:654] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c: Neither blocks nor log segments found. Creating new log.
I20250624 02:15:58.097426 4758 tablet_bootstrap.cc:492] T 40d336f7b7f94a79943ced81ab6e5a5b P 099478c88e8a42f28b19b3fe6acbdd67: Bootstrap starting.
I20250624 02:15:58.099011 4757 log.cc:826] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c: Log is configured to *not* fsync() on all Append() calls
I20250624 02:15:58.100970 4756 tablet_bootstrap.cc:492] T 40d336f7b7f94a79943ced81ab6e5a5b P cfc380cfea2a405fbc345ab5adfaf441: No bootstrap required, opened a new log
I20250624 02:15:58.101547 4756 ts_tablet_manager.cc:1397] T 40d336f7b7f94a79943ced81ab6e5a5b P cfc380cfea2a405fbc345ab5adfaf441: Time spent bootstrapping tablet: real 0.016s user 0.004s sys 0.010s
I20250624 02:15:58.108695 4758 tablet_bootstrap.cc:654] T 40d336f7b7f94a79943ced81ab6e5a5b P 099478c88e8a42f28b19b3fe6acbdd67: Neither blocks nor log segments found. Creating new log.
I20250624 02:15:58.111377 4758 log.cc:826] T 40d336f7b7f94a79943ced81ab6e5a5b P 099478c88e8a42f28b19b3fe6acbdd67: Log is configured to *not* fsync() on all Append() calls
I20250624 02:15:58.115180 4757 tablet_bootstrap.cc:492] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c: No bootstrap required, opened a new log
I20250624 02:15:58.115618 4757 ts_tablet_manager.cc:1397] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c: Time spent bootstrapping tablet: real 0.027s user 0.012s sys 0.012s
I20250624 02:15:58.118357 4758 tablet_bootstrap.cc:492] T 40d336f7b7f94a79943ced81ab6e5a5b P 099478c88e8a42f28b19b3fe6acbdd67: No bootstrap required, opened a new log
I20250624 02:15:58.118889 4758 ts_tablet_manager.cc:1397] T 40d336f7b7f94a79943ced81ab6e5a5b P 099478c88e8a42f28b19b3fe6acbdd67: Time spent bootstrapping tablet: real 0.022s user 0.005s sys 0.012s
I20250624 02:15:58.129537 4756 raft_consensus.cc:357] T 40d336f7b7f94a79943ced81ab6e5a5b P cfc380cfea2a405fbc345ab5adfaf441 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "cfc380cfea2a405fbc345ab5adfaf441" member_type: VOTER last_known_addr { host: "127.31.42.197" port: 42029 } } peers { permanent_uuid: "45c477fce2ee43db880b2b211efe398c" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 46613 } } peers { permanent_uuid: "099478c88e8a42f28b19b3fe6acbdd67" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 38665 } }
I20250624 02:15:58.130527 4756 raft_consensus.cc:383] T 40d336f7b7f94a79943ced81ab6e5a5b P cfc380cfea2a405fbc345ab5adfaf441 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:15:58.130887 4756 raft_consensus.cc:738] T 40d336f7b7f94a79943ced81ab6e5a5b P cfc380cfea2a405fbc345ab5adfaf441 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: cfc380cfea2a405fbc345ab5adfaf441, State: Initialized, Role: FOLLOWER
I20250624 02:15:58.131778 4756 consensus_queue.cc:260] T 40d336f7b7f94a79943ced81ab6e5a5b P cfc380cfea2a405fbc345ab5adfaf441 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "cfc380cfea2a405fbc345ab5adfaf441" member_type: VOTER last_known_addr { host: "127.31.42.197" port: 42029 } } peers { permanent_uuid: "45c477fce2ee43db880b2b211efe398c" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 46613 } } peers { permanent_uuid: "099478c88e8a42f28b19b3fe6acbdd67" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 38665 } }
I20250624 02:15:58.135264 4737 heartbeater.cc:499] Master 127.31.42.254:46799 was elected leader, sending a full tablet report...
I20250624 02:15:58.136686 4756 ts_tablet_manager.cc:1428] T 40d336f7b7f94a79943ced81ab6e5a5b P cfc380cfea2a405fbc345ab5adfaf441: Time spent starting tablet: real 0.035s user 0.031s sys 0.001s
I20250624 02:15:58.136819 4757 raft_consensus.cc:357] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "cfc380cfea2a405fbc345ab5adfaf441" member_type: VOTER last_known_addr { host: "127.31.42.197" port: 42029 } } peers { permanent_uuid: "45c477fce2ee43db880b2b211efe398c" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 46613 } } peers { permanent_uuid: "099478c88e8a42f28b19b3fe6acbdd67" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 38665 } }
I20250624 02:15:58.138265 4757 raft_consensus.cc:383] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:15:58.138577 4757 raft_consensus.cc:738] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 45c477fce2ee43db880b2b211efe398c, State: Initialized, Role: FOLLOWER
I20250624 02:15:58.139411 4757 consensus_queue.cc:260] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "cfc380cfea2a405fbc345ab5adfaf441" member_type: VOTER last_known_addr { host: "127.31.42.197" port: 42029 } } peers { permanent_uuid: "45c477fce2ee43db880b2b211efe398c" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 46613 } } peers { permanent_uuid: "099478c88e8a42f28b19b3fe6acbdd67" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 38665 } }
I20250624 02:15:58.146106 4758 raft_consensus.cc:357] T 40d336f7b7f94a79943ced81ab6e5a5b P 099478c88e8a42f28b19b3fe6acbdd67 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "cfc380cfea2a405fbc345ab5adfaf441" member_type: VOTER last_known_addr { host: "127.31.42.197" port: 42029 } } peers { permanent_uuid: "45c477fce2ee43db880b2b211efe398c" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 46613 } } peers { permanent_uuid: "099478c88e8a42f28b19b3fe6acbdd67" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 38665 } }
I20250624 02:15:58.146842 4757 ts_tablet_manager.cc:1428] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c: Time spent starting tablet: real 0.031s user 0.019s sys 0.010s
I20250624 02:15:58.147068 4758 raft_consensus.cc:383] T 40d336f7b7f94a79943ced81ab6e5a5b P 099478c88e8a42f28b19b3fe6acbdd67 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:15:58.147447 4758 raft_consensus.cc:738] T 40d336f7b7f94a79943ced81ab6e5a5b P 099478c88e8a42f28b19b3fe6acbdd67 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 099478c88e8a42f28b19b3fe6acbdd67, State: Initialized, Role: FOLLOWER
I20250624 02:15:58.148281 4758 consensus_queue.cc:260] T 40d336f7b7f94a79943ced81ab6e5a5b P 099478c88e8a42f28b19b3fe6acbdd67 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "cfc380cfea2a405fbc345ab5adfaf441" member_type: VOTER last_known_addr { host: "127.31.42.197" port: 42029 } } peers { permanent_uuid: "45c477fce2ee43db880b2b211efe398c" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 46613 } } peers { permanent_uuid: "099478c88e8a42f28b19b3fe6acbdd67" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 38665 } }
I20250624 02:15:58.153616 4758 ts_tablet_manager.cc:1428] T 40d336f7b7f94a79943ced81ab6e5a5b P 099478c88e8a42f28b19b3fe6acbdd67: Time spent starting tablet: real 0.034s user 0.023s sys 0.010s
W20250624 02:15:58.153723 4738 tablet.cc:2378] T 40d336f7b7f94a79943ced81ab6e5a5b P cfc380cfea2a405fbc345ab5adfaf441: Can't schedule compaction. Clean time has not been advanced past its initial value.
W20250624 02:15:58.302793 4604 tablet.cc:2378] T 40d336f7b7f94a79943ced81ab6e5a5b P 099478c88e8a42f28b19b3fe6acbdd67: Can't schedule compaction. Clean time has not been advanced past its initial value.
W20250624 02:15:58.327672 4204 tablet.cc:2378] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20250624 02:15:58.447686 4762 raft_consensus.cc:491] T 40d336f7b7f94a79943ced81ab6e5a5b P cfc380cfea2a405fbc345ab5adfaf441 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250624 02:15:58.448173 4762 raft_consensus.cc:513] T 40d336f7b7f94a79943ced81ab6e5a5b P cfc380cfea2a405fbc345ab5adfaf441 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "cfc380cfea2a405fbc345ab5adfaf441" member_type: VOTER last_known_addr { host: "127.31.42.197" port: 42029 } } peers { permanent_uuid: "45c477fce2ee43db880b2b211efe398c" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 46613 } } peers { permanent_uuid: "099478c88e8a42f28b19b3fe6acbdd67" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 38665 } }
I20250624 02:15:58.450513 4762 leader_election.cc:290] T 40d336f7b7f94a79943ced81ab6e5a5b P cfc380cfea2a405fbc345ab5adfaf441 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 45c477fce2ee43db880b2b211efe398c (127.31.42.193:46613), 099478c88e8a42f28b19b3fe6acbdd67 (127.31.42.196:38665)
I20250624 02:15:58.462131 4158 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "40d336f7b7f94a79943ced81ab6e5a5b" candidate_uuid: "cfc380cfea2a405fbc345ab5adfaf441" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "45c477fce2ee43db880b2b211efe398c" is_pre_election: true
I20250624 02:15:58.462848 4158 raft_consensus.cc:2466] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate cfc380cfea2a405fbc345ab5adfaf441 in term 0.
I20250624 02:15:58.463851 4558 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "40d336f7b7f94a79943ced81ab6e5a5b" candidate_uuid: "cfc380cfea2a405fbc345ab5adfaf441" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "099478c88e8a42f28b19b3fe6acbdd67" is_pre_election: true
I20250624 02:15:58.464076 4626 leader_election.cc:304] T 40d336f7b7f94a79943ced81ab6e5a5b P cfc380cfea2a405fbc345ab5adfaf441 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 45c477fce2ee43db880b2b211efe398c, cfc380cfea2a405fbc345ab5adfaf441; no voters:
I20250624 02:15:58.464787 4762 raft_consensus.cc:2802] T 40d336f7b7f94a79943ced81ab6e5a5b P cfc380cfea2a405fbc345ab5adfaf441 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20250624 02:15:58.464675 4558 raft_consensus.cc:2466] T 40d336f7b7f94a79943ced81ab6e5a5b P 099478c88e8a42f28b19b3fe6acbdd67 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate cfc380cfea2a405fbc345ab5adfaf441 in term 0.
I20250624 02:15:58.465073 4762 raft_consensus.cc:491] T 40d336f7b7f94a79943ced81ab6e5a5b P cfc380cfea2a405fbc345ab5adfaf441 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20250624 02:15:58.465322 4762 raft_consensus.cc:3058] T 40d336f7b7f94a79943ced81ab6e5a5b P cfc380cfea2a405fbc345ab5adfaf441 [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:15:58.470479 4762 raft_consensus.cc:513] T 40d336f7b7f94a79943ced81ab6e5a5b P cfc380cfea2a405fbc345ab5adfaf441 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "cfc380cfea2a405fbc345ab5adfaf441" member_type: VOTER last_known_addr { host: "127.31.42.197" port: 42029 } } peers { permanent_uuid: "45c477fce2ee43db880b2b211efe398c" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 46613 } } peers { permanent_uuid: "099478c88e8a42f28b19b3fe6acbdd67" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 38665 } }
I20250624 02:15:58.471819 4762 leader_election.cc:290] T 40d336f7b7f94a79943ced81ab6e5a5b P cfc380cfea2a405fbc345ab5adfaf441 [CANDIDATE]: Term 1 election: Requested vote from peers 45c477fce2ee43db880b2b211efe398c (127.31.42.193:46613), 099478c88e8a42f28b19b3fe6acbdd67 (127.31.42.196:38665)
I20250624 02:15:58.472684 4158 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "40d336f7b7f94a79943ced81ab6e5a5b" candidate_uuid: "cfc380cfea2a405fbc345ab5adfaf441" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "45c477fce2ee43db880b2b211efe398c"
I20250624 02:15:58.472834 4558 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "40d336f7b7f94a79943ced81ab6e5a5b" candidate_uuid: "cfc380cfea2a405fbc345ab5adfaf441" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "099478c88e8a42f28b19b3fe6acbdd67"
I20250624 02:15:58.473240 4158 raft_consensus.cc:3058] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:15:58.473342 4558 raft_consensus.cc:3058] T 40d336f7b7f94a79943ced81ab6e5a5b P 099478c88e8a42f28b19b3fe6acbdd67 [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:15:58.480204 4158 raft_consensus.cc:2466] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate cfc380cfea2a405fbc345ab5adfaf441 in term 1.
I20250624 02:15:58.480201 4558 raft_consensus.cc:2466] T 40d336f7b7f94a79943ced81ab6e5a5b P 099478c88e8a42f28b19b3fe6acbdd67 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate cfc380cfea2a405fbc345ab5adfaf441 in term 1.
I20250624 02:15:58.481209 4626 leader_election.cc:304] T 40d336f7b7f94a79943ced81ab6e5a5b P cfc380cfea2a405fbc345ab5adfaf441 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 45c477fce2ee43db880b2b211efe398c, cfc380cfea2a405fbc345ab5adfaf441; no voters:
I20250624 02:15:58.481768 4762 raft_consensus.cc:2802] T 40d336f7b7f94a79943ced81ab6e5a5b P cfc380cfea2a405fbc345ab5adfaf441 [term 1 FOLLOWER]: Leader election won for term 1
I20250624 02:15:58.483315 4762 raft_consensus.cc:695] T 40d336f7b7f94a79943ced81ab6e5a5b P cfc380cfea2a405fbc345ab5adfaf441 [term 1 LEADER]: Becoming Leader. State: Replica: cfc380cfea2a405fbc345ab5adfaf441, State: Running, Role: LEADER
I20250624 02:15:58.484172 4762 consensus_queue.cc:237] T 40d336f7b7f94a79943ced81ab6e5a5b P cfc380cfea2a405fbc345ab5adfaf441 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "cfc380cfea2a405fbc345ab5adfaf441" member_type: VOTER last_known_addr { host: "127.31.42.197" port: 42029 } } peers { permanent_uuid: "45c477fce2ee43db880b2b211efe398c" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 46613 } } peers { permanent_uuid: "099478c88e8a42f28b19b3fe6acbdd67" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 38665 } }
I20250624 02:15:58.495519 4015 catalog_manager.cc:5582] T 40d336f7b7f94a79943ced81ab6e5a5b P cfc380cfea2a405fbc345ab5adfaf441 reported cstate change: term changed from 0 to 1, leader changed from <none> to cfc380cfea2a405fbc345ab5adfaf441 (127.31.42.197). New cstate: current_term: 1 leader_uuid: "cfc380cfea2a405fbc345ab5adfaf441" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "cfc380cfea2a405fbc345ab5adfaf441" member_type: VOTER last_known_addr { host: "127.31.42.197" port: 42029 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "45c477fce2ee43db880b2b211efe398c" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 46613 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "099478c88e8a42f28b19b3fe6acbdd67" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 38665 } health_report { overall_health: UNKNOWN } } }
I20250624 02:15:58.513067 31915 external_mini_cluster.cc:934] 5 TS(s) registered with all masters
I20250624 02:15:58.516621 31915 ts_itest-base.cc:246] Waiting for 1 tablets on tserver 45c477fce2ee43db880b2b211efe398c to finish bootstrapping
I20250624 02:15:58.528939 31915 ts_itest-base.cc:246] Waiting for 1 tablets on tserver 099478c88e8a42f28b19b3fe6acbdd67 to finish bootstrapping
I20250624 02:15:58.539533 31915 ts_itest-base.cc:246] Waiting for 1 tablets on tserver cfc380cfea2a405fbc345ab5adfaf441 to finish bootstrapping
I20250624 02:15:58.550686 31915 test_util.cc:276] Using random seed: -390514046
I20250624 02:15:58.582695 31915 test_workload.cc:405] TestWorkload: Skipping table creation because table TestTable already exists
I20250624 02:15:58.588783 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 4607
W20250624 02:15:58.622961 4774 negotiation.cc:337] Failed RPC negotiation. Trace:
0624 02:15:58.612066 (+ 0us) reactor.cc:625] Submitting negotiation task for client connection to 127.31.42.197:42029 (local address 127.0.0.1:55822)
0624 02:15:58.612813 (+ 747us) negotiation.cc:107] Waiting for socket to connect
0624 02:15:58.612872 (+ 59us) client_negotiation.cc:174] Beginning negotiation
0624 02:15:58.613090 (+ 218us) client_negotiation.cc:252] Sending NEGOTIATE NegotiatePB request
0624 02:15:58.621354 (+ 8264us) negotiation.cc:327] Negotiation complete: Network error: Client connection negotiation failed: client connection to 127.31.42.197:42029: BlockingRecv error: recv error from unknown peer: Transport endpoint is not connected (error 107)
Metrics: {"client-negotiator.queue_time_us":81}
W20250624 02:15:58.633644 4772 meta_cache.cc:302] tablet 40d336f7b7f94a79943ced81ab6e5a5b: replica cfc380cfea2a405fbc345ab5adfaf441 (127.31.42.197:42029) has failed: Network error: Client connection negotiation failed: client connection to 127.31.42.197:42029: BlockingRecv error: recv error from unknown peer: Transport endpoint is not connected (error 107)
W20250624 02:15:58.656240 4118 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:37078: Illegal state: replica 45c477fce2ee43db880b2b211efe398c is not leader of this config: current role FOLLOWER
W20250624 02:15:58.674655 4518 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:56536: Illegal state: replica 099478c88e8a42f28b19b3fe6acbdd67 is not leader of this config: current role FOLLOWER
W20250624 02:15:58.699003 4118 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:37078: Illegal state: replica 45c477fce2ee43db880b2b211efe398c is not leader of this config: current role FOLLOWER
W20250624 02:15:58.711395 4518 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:56536: Illegal state: replica 099478c88e8a42f28b19b3fe6acbdd67 is not leader of this config: current role FOLLOWER
W20250624 02:15:58.740388 4118 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:37078: Illegal state: replica 45c477fce2ee43db880b2b211efe398c is not leader of this config: current role FOLLOWER
W20250624 02:15:58.753437 4518 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:56536: Illegal state: replica 099478c88e8a42f28b19b3fe6acbdd67 is not leader of this config: current role FOLLOWER
W20250624 02:15:58.783262 4118 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:37078: Illegal state: replica 45c477fce2ee43db880b2b211efe398c is not leader of this config: current role FOLLOWER
W20250624 02:15:58.798239 4518 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:56536: Illegal state: replica 099478c88e8a42f28b19b3fe6acbdd67 is not leader of this config: current role FOLLOWER
W20250624 02:15:58.838528 4118 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:37078: Illegal state: replica 45c477fce2ee43db880b2b211efe398c is not leader of this config: current role FOLLOWER
W20250624 02:15:58.856437 4518 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:56536: Illegal state: replica 099478c88e8a42f28b19b3fe6acbdd67 is not leader of this config: current role FOLLOWER
W20250624 02:15:58.901676 4118 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:37078: Illegal state: replica 45c477fce2ee43db880b2b211efe398c is not leader of this config: current role FOLLOWER
W20250624 02:15:58.923934 4518 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:56536: Illegal state: replica 099478c88e8a42f28b19b3fe6acbdd67 is not leader of this config: current role FOLLOWER
W20250624 02:15:58.978886 4118 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:37078: Illegal state: replica 45c477fce2ee43db880b2b211efe398c is not leader of this config: current role FOLLOWER
W20250624 02:15:59.005350 4518 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:56536: Illegal state: replica 099478c88e8a42f28b19b3fe6acbdd67 is not leader of this config: current role FOLLOWER
W20250624 02:15:59.067437 4118 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:37078: Illegal state: replica 45c477fce2ee43db880b2b211efe398c is not leader of this config: current role FOLLOWER
W20250624 02:15:59.099520 4518 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:56536: Illegal state: replica 099478c88e8a42f28b19b3fe6acbdd67 is not leader of this config: current role FOLLOWER
W20250624 02:15:59.165457 4118 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:37078: Illegal state: replica 45c477fce2ee43db880b2b211efe398c is not leader of this config: current role FOLLOWER
W20250624 02:15:59.196851 4518 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:56536: Illegal state: replica 099478c88e8a42f28b19b3fe6acbdd67 is not leader of this config: current role FOLLOWER
W20250624 02:15:59.266247 4118 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:37078: Illegal state: replica 45c477fce2ee43db880b2b211efe398c is not leader of this config: current role FOLLOWER
W20250624 02:15:59.302124 4518 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:56536: Illegal state: replica 099478c88e8a42f28b19b3fe6acbdd67 is not leader of this config: current role FOLLOWER
W20250624 02:15:59.374518 4118 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:37078: Illegal state: replica 45c477fce2ee43db880b2b211efe398c is not leader of this config: current role FOLLOWER
W20250624 02:15:59.416076 4518 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:56536: Illegal state: replica 099478c88e8a42f28b19b3fe6acbdd67 is not leader of this config: current role FOLLOWER
W20250624 02:15:59.502174 4118 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:37078: Illegal state: replica 45c477fce2ee43db880b2b211efe398c is not leader of this config: current role FOLLOWER
W20250624 02:15:59.545095 4518 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:56536: Illegal state: replica 099478c88e8a42f28b19b3fe6acbdd67 is not leader of this config: current role FOLLOWER
W20250624 02:15:59.635592 4118 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:37078: Illegal state: replica 45c477fce2ee43db880b2b211efe398c is not leader of this config: current role FOLLOWER
W20250624 02:15:59.684963 4518 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:56536: Illegal state: replica 099478c88e8a42f28b19b3fe6acbdd67 is not leader of this config: current role FOLLOWER
W20250624 02:15:59.738763 4772 meta_cache.cc:302] tablet 40d336f7b7f94a79943ced81ab6e5a5b: replica cfc380cfea2a405fbc345ab5adfaf441 (127.31.42.197:42029) has failed: Network error: Client connection negotiation failed: client connection to 127.31.42.197:42029: connect: Connection refused (error 111)
W20250624 02:15:59.784993 4118 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:37078: Illegal state: replica 45c477fce2ee43db880b2b211efe398c is not leader of this config: current role FOLLOWER
W20250624 02:15:59.833915 4518 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:56536: Illegal state: replica 099478c88e8a42f28b19b3fe6acbdd67 is not leader of this config: current role FOLLOWER
W20250624 02:15:59.938216 4118 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:37078: Illegal state: replica 45c477fce2ee43db880b2b211efe398c is not leader of this config: current role FOLLOWER
W20250624 02:15:59.988445 4518 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:56536: Illegal state: replica 099478c88e8a42f28b19b3fe6acbdd67 is not leader of this config: current role FOLLOWER
W20250624 02:16:00.098829 4118 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:37078: Illegal state: replica 45c477fce2ee43db880b2b211efe398c is not leader of this config: current role FOLLOWER
W20250624 02:16:00.151319 4518 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:56536: Illegal state: replica 099478c88e8a42f28b19b3fe6acbdd67 is not leader of this config: current role FOLLOWER
I20250624 02:16:00.234256 4789 raft_consensus.cc:491] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c [term 1 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250624 02:16:00.234854 4789 raft_consensus.cc:513] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "cfc380cfea2a405fbc345ab5adfaf441" member_type: VOTER last_known_addr { host: "127.31.42.197" port: 42029 } } peers { permanent_uuid: "45c477fce2ee43db880b2b211efe398c" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 46613 } } peers { permanent_uuid: "099478c88e8a42f28b19b3fe6acbdd67" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 38665 } }
I20250624 02:16:00.238226 4789 leader_election.cc:290] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers cfc380cfea2a405fbc345ab5adfaf441 (127.31.42.197:42029), 099478c88e8a42f28b19b3fe6acbdd67 (127.31.42.196:38665)
W20250624 02:16:00.263281 4093 proxy.cc:239] Call had error, refreshing address and retrying: Network error: Client connection negotiation failed: client connection to 127.31.42.197:42029: connect: Connection refused (error 111)
I20250624 02:16:00.273347 4558 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "40d336f7b7f94a79943ced81ab6e5a5b" candidate_uuid: "45c477fce2ee43db880b2b211efe398c" candidate_term: 2 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "099478c88e8a42f28b19b3fe6acbdd67" is_pre_election: true
I20250624 02:16:00.274212 4558 raft_consensus.cc:2466] T 40d336f7b7f94a79943ced81ab6e5a5b P 099478c88e8a42f28b19b3fe6acbdd67 [term 1 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 45c477fce2ee43db880b2b211efe398c in term 1.
I20250624 02:16:00.276347 4092 leader_election.cc:304] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 099478c88e8a42f28b19b3fe6acbdd67, 45c477fce2ee43db880b2b211efe398c; no voters:
I20250624 02:16:00.277688 4789 raft_consensus.cc:2802] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c [term 1 FOLLOWER]: Leader pre-election won for term 2
I20250624 02:16:00.278162 4789 raft_consensus.cc:491] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c [term 1 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20250624 02:16:00.278548 4789 raft_consensus.cc:3058] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c [term 1 FOLLOWER]: Advancing to term 2
I20250624 02:16:00.289510 4789 raft_consensus.cc:513] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "cfc380cfea2a405fbc345ab5adfaf441" member_type: VOTER last_known_addr { host: "127.31.42.197" port: 42029 } } peers { permanent_uuid: "45c477fce2ee43db880b2b211efe398c" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 46613 } } peers { permanent_uuid: "099478c88e8a42f28b19b3fe6acbdd67" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 38665 } }
I20250624 02:16:00.293017 4789 leader_election.cc:290] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c [CANDIDATE]: Term 2 election: Requested vote from peers cfc380cfea2a405fbc345ab5adfaf441 (127.31.42.197:42029), 099478c88e8a42f28b19b3fe6acbdd67 (127.31.42.196:38665)
W20250624 02:16:00.293123 4118 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:37078: Illegal state: replica 45c477fce2ee43db880b2b211efe398c is not leader of this config: current role FOLLOWER
I20250624 02:16:00.296021 4558 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "40d336f7b7f94a79943ced81ab6e5a5b" candidate_uuid: "45c477fce2ee43db880b2b211efe398c" candidate_term: 2 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "099478c88e8a42f28b19b3fe6acbdd67"
I20250624 02:16:00.296763 4558 raft_consensus.cc:3058] T 40d336f7b7f94a79943ced81ab6e5a5b P 099478c88e8a42f28b19b3fe6acbdd67 [term 1 FOLLOWER]: Advancing to term 2
W20250624 02:16:00.299134 4093 leader_election.cc:336] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c [CANDIDATE]: Term 2 election: RPC error from VoteRequest() call to peer cfc380cfea2a405fbc345ab5adfaf441 (127.31.42.197:42029): Network error: Client connection negotiation failed: client connection to 127.31.42.197:42029: connect: Connection refused (error 111)
W20250624 02:16:00.299726 4093 leader_election.cc:336] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c [CANDIDATE]: Term 2 pre-election: RPC error from VoteRequest() call to peer cfc380cfea2a405fbc345ab5adfaf441 (127.31.42.197:42029): Network error: Client connection negotiation failed: client connection to 127.31.42.197:42029: connect: Connection refused (error 111)
I20250624 02:16:00.306802 4558 raft_consensus.cc:2466] T 40d336f7b7f94a79943ced81ab6e5a5b P 099478c88e8a42f28b19b3fe6acbdd67 [term 2 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 45c477fce2ee43db880b2b211efe398c in term 2.
I20250624 02:16:00.308290 4092 leader_election.cc:304] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 099478c88e8a42f28b19b3fe6acbdd67, 45c477fce2ee43db880b2b211efe398c; no voters: cfc380cfea2a405fbc345ab5adfaf441
I20250624 02:16:00.309409 4789 raft_consensus.cc:2802] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c [term 2 FOLLOWER]: Leader election won for term 2
I20250624 02:16:00.312309 4789 raft_consensus.cc:695] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c [term 2 LEADER]: Becoming Leader. State: Replica: 45c477fce2ee43db880b2b211efe398c, State: Running, Role: LEADER
I20250624 02:16:00.313390 4789 consensus_queue.cc:237] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "cfc380cfea2a405fbc345ab5adfaf441" member_type: VOTER last_known_addr { host: "127.31.42.197" port: 42029 } } peers { permanent_uuid: "45c477fce2ee43db880b2b211efe398c" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 46613 } } peers { permanent_uuid: "099478c88e8a42f28b19b3fe6acbdd67" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 38665 } }
I20250624 02:16:00.330117 4014 catalog_manager.cc:5582] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c reported cstate change: term changed from 1 to 2, leader changed from cfc380cfea2a405fbc345ab5adfaf441 (127.31.42.197) to 45c477fce2ee43db880b2b211efe398c (127.31.42.193). New cstate: current_term: 2 leader_uuid: "45c477fce2ee43db880b2b211efe398c" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "cfc380cfea2a405fbc345ab5adfaf441" member_type: VOTER last_known_addr { host: "127.31.42.197" port: 42029 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "45c477fce2ee43db880b2b211efe398c" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 46613 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "099478c88e8a42f28b19b3fe6acbdd67" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 38665 } health_report { overall_health: UNKNOWN } } }
W20250624 02:16:00.350082 4518 tablet_service.cc:696] failed op from {username='slave'} at 127.0.0.1:56536: Illegal state: replica 099478c88e8a42f28b19b3fe6acbdd67 is not leader of this config: current role FOLLOWER
I20250624 02:16:00.422246 4558 raft_consensus.cc:1273] T 40d336f7b7f94a79943ced81ab6e5a5b P 099478c88e8a42f28b19b3fe6acbdd67 [term 2 FOLLOWER]: Refusing update from remote peer 45c477fce2ee43db880b2b211efe398c: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 2 index: 2. (index mismatch)
W20250624 02:16:00.426723 4093 consensus_peers.cc:487] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c -> Peer cfc380cfea2a405fbc345ab5adfaf441 (127.31.42.197:42029): Couldn't send request to peer cfc380cfea2a405fbc345ab5adfaf441. Status: Network error: Client connection negotiation failed: client connection to 127.31.42.197:42029: connect: Connection refused (error 111). This is attempt 1: this message will repeat every 5th retry.
I20250624 02:16:00.427273 4789 consensus_queue.cc:1035] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c [LEADER]: Connected to new peer: Peer: permanent_uuid: "099478c88e8a42f28b19b3fe6acbdd67" member_type: VOTER last_known_addr { host: "127.31.42.196" port: 38665 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20250624 02:16:00.446269 4804 mvcc.cc:204] Tried to move back new op lower bound from 7170995652271067136 to 7170995651853135872. Current Snapshot: MvccSnapshot[applied={T|T < 7170995652271067136}]
I20250624 02:16:00.454099 4806 mvcc.cc:204] Tried to move back new op lower bound from 7170995652271067136 to 7170995651853135872. Current Snapshot: MvccSnapshot[applied={T|T < 7170995652271067136}]
I20250624 02:16:00.825695 4405 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250624 02:16:00.833360 4538 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250624 02:16:00.837404 4138 tablet_service.cc:1430] Tablet server has 1 leaders and 0 scanners
I20250624 02:16:00.866881 4271 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250624 02:16:02.604053 4538 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250624 02:16:02.671766 4271 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250624 02:16:02.676488 4138 tablet_service.cc:1430] Tablet server has 1 leaders and 0 scanners
I20250624 02:16:02.681452 4405 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
W20250624 02:16:02.885046 4093 consensus_peers.cc:487] T 40d336f7b7f94a79943ced81ab6e5a5b P 45c477fce2ee43db880b2b211efe398c -> Peer cfc380cfea2a405fbc345ab5adfaf441 (127.31.42.197:42029): Couldn't send request to peer cfc380cfea2a405fbc345ab5adfaf441. Status: Network error: Client connection negotiation failed: client connection to 127.31.42.197:42029: connect: Connection refused (error 111). This is attempt 6: this message will repeat every 5th retry.
I20250624 02:16:05.127895 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 4074
I20250624 02:16:05.169508 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 4207
I20250624 02:16:05.194624 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 4340
I20250624 02:16:05.219638 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 4474
I20250624 02:16:05.253883 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 3982
2025-06-24T02:16:05Z chronyd exiting
[ OK ] EnableKudu1097AndDownTS/MoveTabletParamTest.Test/4 (18567 ms)
[----------] 1 test from EnableKudu1097AndDownTS/MoveTabletParamTest (18567 ms total)
[----------] 1 test from ListTableCliSimpleParamTest
[ RUN ] ListTableCliSimpleParamTest.TestListTables/2
I20250624 02:16:05.316897 31915 test_util.cc:276] Using random seed: -383747836
I20250624 02:16:05.321213 31915 ts_itest-base.cc:115] Starting cluster with:
I20250624 02:16:05.321375 31915 ts_itest-base.cc:116] --------------
I20250624 02:16:05.321543 31915 ts_itest-base.cc:117] 1 tablet servers
I20250624 02:16:05.321688 31915 ts_itest-base.cc:118] 1 replicas per TS
I20250624 02:16:05.321835 31915 ts_itest-base.cc:119] --------------
2025-06-24T02:16:05Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2025-06-24T02:16:05Z Disabled control of system clock
I20250624 02:16:05.369696 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.31.42.254:46315
--webserver_interface=127.31.42.254
--webserver_port=0
--builtin_ntp_servers=127.31.42.212:44657
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.31.42.254:46315 with env {}
W20250624 02:16:05.690389 4888 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:16:05.690977 4888 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:16:05.691390 4888 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:16:05.722666 4888 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20250624 02:16:05.722970 4888 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:16:05.723176 4888 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20250624 02:16:05.723371 4888 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20250624 02:16:05.758746 4888 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:44657
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.31.42.254:46315
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.31.42.254:46315
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb
--webserver_interface=127.31.42.254
--webserver_port=0
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/logs
--logbuflevel=-1
--logtostderr=true
Master server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:16:05.760003 4888 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:16:05.761616 4888 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:16:05.777226 4895 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:16:05.777226 4894 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:16:05.777539 4897 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:16:05.779896 4888 server_base.cc:1048] running on GCE node
I20250624 02:16:06.966010 4888 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:16:06.969178 4888 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:16:06.970618 4888 hybrid_clock.cc:648] HybridClock initialized: now 1750731366970571 us; error 64 us; skew 500 ppm
I20250624 02:16:06.971449 4888 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:16:06.978463 4888 webserver.cc:469] Webserver started at http://127.31.42.254:33095/ using document root <none> and password file <none>
I20250624 02:16:06.979439 4888 fs_manager.cc:362] Metadata directory not provided
I20250624 02:16:06.979640 4888 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:16:06.980167 4888 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:16:06.986419 4888 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/instance:
uuid: "9b0baa89bbaf4557a125b8fc0f9797f7"
format_stamp: "Formatted at 2025-06-24 02:16:06 on dist-test-slave-5k9r"
I20250624 02:16:06.987591 4888 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal/instance:
uuid: "9b0baa89bbaf4557a125b8fc0f9797f7"
format_stamp: "Formatted at 2025-06-24 02:16:06 on dist-test-slave-5k9r"
I20250624 02:16:06.994968 4888 fs_manager.cc:696] Time spent creating directory manager: real 0.007s user 0.007s sys 0.001s
I20250624 02:16:07.000685 4904 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:16:07.002125 4888 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.002s sys 0.000s
I20250624 02:16:07.002463 4888 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
uuid: "9b0baa89bbaf4557a125b8fc0f9797f7"
format_stamp: "Formatted at 2025-06-24 02:16:06 on dist-test-slave-5k9r"
I20250624 02:16:07.002794 4888 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:16:07.054863 4888 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:16:07.056432 4888 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:16:07.056926 4888 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:16:07.129673 4888 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.254:46315
I20250624 02:16:07.129756 4955 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.254:46315 every 8 connection(s)
I20250624 02:16:07.132431 4888 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/data/info.pb
I20250624 02:16:07.137539 4956 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:16:07.139539 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 4888
I20250624 02:16:07.140055 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/master-0/wal/instance
I20250624 02:16:07.158116 4956 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 9b0baa89bbaf4557a125b8fc0f9797f7: Bootstrap starting.
I20250624 02:16:07.164393 4956 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 9b0baa89bbaf4557a125b8fc0f9797f7: Neither blocks nor log segments found. Creating new log.
I20250624 02:16:07.166211 4956 log.cc:826] T 00000000000000000000000000000000 P 9b0baa89bbaf4557a125b8fc0f9797f7: Log is configured to *not* fsync() on all Append() calls
I20250624 02:16:07.170796 4956 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 9b0baa89bbaf4557a125b8fc0f9797f7: No bootstrap required, opened a new log
I20250624 02:16:07.189791 4956 raft_consensus.cc:357] T 00000000000000000000000000000000 P 9b0baa89bbaf4557a125b8fc0f9797f7 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "9b0baa89bbaf4557a125b8fc0f9797f7" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 46315 } }
I20250624 02:16:07.190513 4956 raft_consensus.cc:383] T 00000000000000000000000000000000 P 9b0baa89bbaf4557a125b8fc0f9797f7 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:16:07.190747 4956 raft_consensus.cc:738] T 00000000000000000000000000000000 P 9b0baa89bbaf4557a125b8fc0f9797f7 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 9b0baa89bbaf4557a125b8fc0f9797f7, State: Initialized, Role: FOLLOWER
I20250624 02:16:07.191380 4956 consensus_queue.cc:260] T 00000000000000000000000000000000 P 9b0baa89bbaf4557a125b8fc0f9797f7 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "9b0baa89bbaf4557a125b8fc0f9797f7" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 46315 } }
I20250624 02:16:07.191859 4956 raft_consensus.cc:397] T 00000000000000000000000000000000 P 9b0baa89bbaf4557a125b8fc0f9797f7 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 02:16:07.192104 4956 raft_consensus.cc:491] T 00000000000000000000000000000000 P 9b0baa89bbaf4557a125b8fc0f9797f7 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 02:16:07.192356 4956 raft_consensus.cc:3058] T 00000000000000000000000000000000 P 9b0baa89bbaf4557a125b8fc0f9797f7 [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:16:07.196444 4956 raft_consensus.cc:513] T 00000000000000000000000000000000 P 9b0baa89bbaf4557a125b8fc0f9797f7 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "9b0baa89bbaf4557a125b8fc0f9797f7" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 46315 } }
I20250624 02:16:07.197141 4956 leader_election.cc:304] T 00000000000000000000000000000000 P 9b0baa89bbaf4557a125b8fc0f9797f7 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 9b0baa89bbaf4557a125b8fc0f9797f7; no voters:
I20250624 02:16:07.199059 4956 leader_election.cc:290] T 00000000000000000000000000000000 P 9b0baa89bbaf4557a125b8fc0f9797f7 [CANDIDATE]: Term 1 election: Requested vote from peers
I20250624 02:16:07.199971 4961 raft_consensus.cc:2802] T 00000000000000000000000000000000 P 9b0baa89bbaf4557a125b8fc0f9797f7 [term 1 FOLLOWER]: Leader election won for term 1
I20250624 02:16:07.202318 4961 raft_consensus.cc:695] T 00000000000000000000000000000000 P 9b0baa89bbaf4557a125b8fc0f9797f7 [term 1 LEADER]: Becoming Leader. State: Replica: 9b0baa89bbaf4557a125b8fc0f9797f7, State: Running, Role: LEADER
I20250624 02:16:07.203023 4961 consensus_queue.cc:237] T 00000000000000000000000000000000 P 9b0baa89bbaf4557a125b8fc0f9797f7 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "9b0baa89bbaf4557a125b8fc0f9797f7" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 46315 } }
I20250624 02:16:07.203447 4956 sys_catalog.cc:564] T 00000000000000000000000000000000 P 9b0baa89bbaf4557a125b8fc0f9797f7 [sys.catalog]: configured and running, proceeding with master startup.
I20250624 02:16:07.219050 4963 sys_catalog.cc:455] T 00000000000000000000000000000000 P 9b0baa89bbaf4557a125b8fc0f9797f7 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 9b0baa89bbaf4557a125b8fc0f9797f7. Latest consensus state: current_term: 1 leader_uuid: "9b0baa89bbaf4557a125b8fc0f9797f7" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "9b0baa89bbaf4557a125b8fc0f9797f7" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 46315 } } }
I20250624 02:16:07.219707 4963 sys_catalog.cc:458] T 00000000000000000000000000000000 P 9b0baa89bbaf4557a125b8fc0f9797f7 [sys.catalog]: This master's current role is: LEADER
I20250624 02:16:07.217720 4962 sys_catalog.cc:455] T 00000000000000000000000000000000 P 9b0baa89bbaf4557a125b8fc0f9797f7 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "9b0baa89bbaf4557a125b8fc0f9797f7" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "9b0baa89bbaf4557a125b8fc0f9797f7" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 46315 } } }
I20250624 02:16:07.221493 4962 sys_catalog.cc:458] T 00000000000000000000000000000000 P 9b0baa89bbaf4557a125b8fc0f9797f7 [sys.catalog]: This master's current role is: LEADER
I20250624 02:16:07.222579 4971 catalog_manager.cc:1477] Loading table and tablet metadata into memory...
I20250624 02:16:07.233757 4971 catalog_manager.cc:1486] Initializing Kudu cluster ID...
I20250624 02:16:07.251819 4971 catalog_manager.cc:1349] Generated new cluster ID: 4144605d2d9040ea88188509670263d2
I20250624 02:16:07.252229 4971 catalog_manager.cc:1497] Initializing Kudu internal certificate authority...
I20250624 02:16:07.268420 4971 catalog_manager.cc:1372] Generated new certificate authority record
I20250624 02:16:07.270004 4971 catalog_manager.cc:1506] Loading token signing keys...
I20250624 02:16:07.288232 4971 catalog_manager.cc:5955] T 00000000000000000000000000000000 P 9b0baa89bbaf4557a125b8fc0f9797f7: Generated new TSK 0
I20250624 02:16:07.289427 4971 catalog_manager.cc:1516] Initializing in-progress tserver states...
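The master's sys catalog tablet above goes from FOLLOWER to LEADER without waiting for a failure-detector timeout because its Raft config contains exactly one VOTER, so the replica can win an election with only its own vote. A minimal illustrative sketch of that fast-path check follows (hypothetical types; this is not Kudu's actual RaftConsensus code):

#include <iostream>
#include <string>
#include <vector>

// Hypothetical stand-in for a Raft config entry; the real server uses
// protobuf-generated peer descriptors instead.
struct Peer {
  std::string uuid;
  bool is_voter;
};

// Returns true when the local replica is the only voter in the active
// config, i.e. it should skip the failure detector and elect itself now.
bool ShouldTriggerImmediateElection(const std::vector<Peer>& config,
                                    const std::string& local_uuid) {
  int voters = 0;
  bool local_is_voter = false;
  for (const Peer& p : config) {
    if (!p.is_voter) continue;
    ++voters;
    if (p.uuid == local_uuid) local_is_voter = true;
  }
  return voters == 1 && local_is_voter;
}

int main() {
  std::vector<Peer> config = {{"9b0baa89bbaf4557a125b8fc0f9797f7", true}};
  std::cout << ShouldTriggerImmediateElection(
                   config, "9b0baa89bbaf4557a125b8fc0f9797f7")
            << std::endl;  // prints 1: single-voter config, elect immediately
  return 0;
}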
I20250624 02:16:07.305132 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.193:0
--local_ip_for_outbound_sockets=127.31.42.193
--webserver_interface=127.31.42.193
--webserver_port=0
--tserver_master_addrs=127.31.42.254:46315
--builtin_ntp_servers=127.31.42.212:44657
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--consensus_rpc_timeout_ms=30000 with env {}
W20250624 02:16:07.655519 4980 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:16:07.656031 4980 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:16:07.656538 4980 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:16:07.688776 4980 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:16:07.689646 4980 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.193
I20250624 02:16:07.725284 4980 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:44657
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--log_cache_size_limit_mb=10
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.193:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/info.pb
--webserver_interface=127.31.42.193
--webserver_port=0
--tserver_master_addrs=127.31.42.254:46315
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.193
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:16:07.726614 4980 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:16:07.728382 4980 file_cache.cc:492] Constructed file cache 'file cache' with capacity 419430
W20250624 02:16:07.746860 4987 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:16:07.753357 4980 server_base.cc:1048] running on GCE node
W20250624 02:16:07.753230 4989 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:16:07.752208 4986 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:16:08.946894 4980 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:16:08.949860 4980 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:16:08.951439 4980 hybrid_clock.cc:648] HybridClock initialized: now 1750731368951400 us; error 37 us; skew 500 ppm
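For context on the "error 37 us; skew 500 ppm" figures above: the clock's maximum error at initialization is 37 microseconds, and with an assumed drift of 500 ppm the bound grows by 500 microseconds for every second that passes without another synchronization. A small illustrative calculation (an assumption about how the bound is extrapolated, not Kudu's exact error accounting):

#include <cstdio>

// Illustrative only: extrapolate a max-error bound from the values in the
// HybridClock log line above.
double MaxErrorUs(double initial_error_us, double skew_ppm,
                  double seconds_since_sync) {
  // 1 ppm of drift over one second adds 1 microsecond of possible error.
  return initial_error_us + skew_ppm * seconds_since_sync;
}

int main() {
  // 37 us initial error, 500 ppm skew, 2 seconds since the last sync:
  // bound is 37 + 500 * 2 = 1037 us.
  std::printf("%.0f us\n", MaxErrorUs(37, 500, 2));
  return 0;
}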
I20250624 02:16:08.952518 4980 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:16:08.966277 4980 webserver.cc:469] Webserver started at http://127.31.42.193:43283/ using document root <none> and password file <none>
I20250624 02:16:08.967562 4980 fs_manager.cc:362] Metadata directory not provided
I20250624 02:16:08.967851 4980 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:16:08.968453 4980 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:16:08.975188 4980 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/instance:
uuid: "7f662683c78e448e8bdef6b7c67cc59a"
format_stamp: "Formatted at 2025-06-24 02:16:08 on dist-test-slave-5k9r"
I20250624 02:16:08.976691 4980 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal/instance:
uuid: "7f662683c78e448e8bdef6b7c67cc59a"
format_stamp: "Formatted at 2025-06-24 02:16:08 on dist-test-slave-5k9r"
I20250624 02:16:08.986202 4980 fs_manager.cc:696] Time spent creating directory manager: real 0.009s user 0.008s sys 0.004s
I20250624 02:16:08.993878 4996 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:16:08.995258 4980 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.003s sys 0.000s
I20250624 02:16:08.995684 4980 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
uuid: "7f662683c78e448e8bdef6b7c67cc59a"
format_stamp: "Formatted at 2025-06-24 02:16:08 on dist-test-slave-5k9r"
I20250624 02:16:08.996155 4980 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:16:09.083976 4980 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:16:09.085909 4980 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:16:09.086468 4980 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:16:09.089691 4980 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:16:09.095261 4980 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 02:16:09.095544 4980 ts_tablet_manager.cc:525] Time spent loading tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250624 02:16:09.095876 4980 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 02:16:09.096083 4980 ts_tablet_manager.cc:589] Time spent registering tablets: real 0.000s user 0.000s sys 0.000s
I20250624 02:16:09.250110 4980 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.193:45487
I20250624 02:16:09.250185 5108 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.193:45487 every 8 connection(s)
I20250624 02:16:09.253532 4980 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/data/info.pb
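The tablet server was launched with --rpc_bind_addresses=127.31.42.193:0, and the log above then reports the concrete port (45487) it ended up on. Binding to port 0 asks the kernel for any free ephemeral port, which the server can read back with getsockname(). A small self-contained sketch of that technique (generic POSIX sockets code, not Kudu's RPC server implementation):

#include <arpa/inet.h>
#include <cstdio>
#include <cstring>
#include <sys/socket.h>
#include <unistd.h>

int main() {
  int fd = socket(AF_INET, SOCK_STREAM, 0);
  if (fd < 0) { perror("socket"); return 1; }

  sockaddr_in addr;
  std::memset(&addr, 0, sizeof(addr));
  addr.sin_family = AF_INET;
  addr.sin_port = htons(0);  // port 0: let the kernel pick a free port
  inet_pton(AF_INET, "127.0.0.1", &addr.sin_addr);

  if (bind(fd, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) != 0) {
    perror("bind");
    return 1;
  }

  // Ask the kernel which port it actually assigned.
  socklen_t len = sizeof(addr);
  if (getsockname(fd, reinterpret_cast<sockaddr*>(&addr), &len) != 0) {
    perror("getsockname");
    return 1;
  }
  std::printf("bound to 127.0.0.1:%d\n", ntohs(addr.sin_port));

  close(fd);
  return 0;
}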
I20250624 02:16:09.262946 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 4980
I20250624 02:16:09.263360 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.ListTableCliSimpleParamTest.TestListTables_2.1750731222455155-31915-0/raft_consensus-itest-cluster/ts-0/wal/instance
I20250624 02:16:09.275229 5109 heartbeater.cc:344] Connected to a master server at 127.31.42.254:46315
I20250624 02:16:09.275660 5109 heartbeater.cc:461] Registering TS with master...
I20250624 02:16:09.276679 5109 heartbeater.cc:507] Master 127.31.42.254:46315 requested a full tablet report, sending...
I20250624 02:16:09.279330 4921 ts_manager.cc:194] Registered new tserver with Master: 7f662683c78e448e8bdef6b7c67cc59a (127.31.42.193:45487)
I20250624 02:16:09.281630 4921 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.193:36069
I20250624 02:16:09.282536 31915 external_mini_cluster.cc:934] 1 TS(s) registered with all masters
I20250624 02:16:09.313118 4921 catalog_manager.cc:2232] Servicing CreateTable request from {username='slave'} at 127.0.0.1:44260:
name: "TestTable"
schema {
columns {
name: "key"
type: INT32
is_key: true
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "int_val"
type: INT32
is_key: false
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "string_val"
type: STRING
is_key: false
is_nullable: true
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
}
num_replicas: 1
split_rows_range_bounds {
}
partition_schema {
range_schema {
columns {
name: "key"
}
}
}
owner: "alice"
I20250624 02:16:09.371210 5044 tablet_service.cc:1468] Processing CreateTablet for tablet 8145758942e34b2a9831cbcb8a200f66 (DEFAULT_TABLE table=TestTable [id=1a46e288d69d488d8d6d3d5f6186d5fe]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 02:16:09.372633 5044 data_dirs.cc:400] Could only allocate 1 dir of the requested 3 for tablet 8145758942e34b2a9831cbcb8a200f66. 1 dir total, 0 dirs full, 0 dirs failed
I20250624 02:16:09.393132 5124 tablet_bootstrap.cc:492] T 8145758942e34b2a9831cbcb8a200f66 P 7f662683c78e448e8bdef6b7c67cc59a: Bootstrap starting.
I20250624 02:16:09.398572 5124 tablet_bootstrap.cc:654] T 8145758942e34b2a9831cbcb8a200f66 P 7f662683c78e448e8bdef6b7c67cc59a: Neither blocks nor log segments found. Creating new log.
I20250624 02:16:09.400434 5124 log.cc:826] T 8145758942e34b2a9831cbcb8a200f66 P 7f662683c78e448e8bdef6b7c67cc59a: Log is configured to *not* fsync() on all Append() calls
I20250624 02:16:09.405700 5124 tablet_bootstrap.cc:492] T 8145758942e34b2a9831cbcb8a200f66 P 7f662683c78e448e8bdef6b7c67cc59a: No bootstrap required, opened a new log
I20250624 02:16:09.406159 5124 ts_tablet_manager.cc:1397] T 8145758942e34b2a9831cbcb8a200f66 P 7f662683c78e448e8bdef6b7c67cc59a: Time spent bootstrapping tablet: real 0.014s user 0.009s sys 0.004s
I20250624 02:16:09.423298 5124 raft_consensus.cc:357] T 8145758942e34b2a9831cbcb8a200f66 P 7f662683c78e448e8bdef6b7c67cc59a [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "7f662683c78e448e8bdef6b7c67cc59a" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 45487 } }
I20250624 02:16:09.423869 5124 raft_consensus.cc:383] T 8145758942e34b2a9831cbcb8a200f66 P 7f662683c78e448e8bdef6b7c67cc59a [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:16:09.424105 5124 raft_consensus.cc:738] T 8145758942e34b2a9831cbcb8a200f66 P 7f662683c78e448e8bdef6b7c67cc59a [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7f662683c78e448e8bdef6b7c67cc59a, State: Initialized, Role: FOLLOWER
I20250624 02:16:09.424764 5124 consensus_queue.cc:260] T 8145758942e34b2a9831cbcb8a200f66 P 7f662683c78e448e8bdef6b7c67cc59a [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "7f662683c78e448e8bdef6b7c67cc59a" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 45487 } }
I20250624 02:16:09.425259 5124 raft_consensus.cc:397] T 8145758942e34b2a9831cbcb8a200f66 P 7f662683c78e448e8bdef6b7c67cc59a [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 02:16:09.425522 5124 raft_consensus.cc:491] T 8145758942e34b2a9831cbcb8a200f66 P 7f662683c78e448e8bdef6b7c67cc59a [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 02:16:09.425838 5124 raft_consensus.cc:3058] T 8145758942e34b2a9831cbcb8a200f66 P 7f662683c78e448e8bdef6b7c67cc59a [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:16:09.430660 5124 raft_consensus.cc:513] T 8145758942e34b2a9831cbcb8a200f66 P 7f662683c78e448e8bdef6b7c67cc59a [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "7f662683c78e448e8bdef6b7c67cc59a" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 45487 } }
I20250624 02:16:09.431371 5124 leader_election.cc:304] T 8145758942e34b2a9831cbcb8a200f66 P 7f662683c78e448e8bdef6b7c67cc59a [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 7f662683c78e448e8bdef6b7c67cc59a; no voters:
I20250624 02:16:09.433116 5124 leader_election.cc:290] T 8145758942e34b2a9831cbcb8a200f66 P 7f662683c78e448e8bdef6b7c67cc59a [CANDIDATE]: Term 1 election: Requested vote from peers
I20250624 02:16:09.433485 5126 raft_consensus.cc:2802] T 8145758942e34b2a9831cbcb8a200f66 P 7f662683c78e448e8bdef6b7c67cc59a [term 1 FOLLOWER]: Leader election won for term 1
I20250624 02:16:09.437073 5109 heartbeater.cc:499] Master 127.31.42.254:46315 was elected leader, sending a full tablet report...
I20250624 02:16:09.436481 5126 raft_consensus.cc:695] T 8145758942e34b2a9831cbcb8a200f66 P 7f662683c78e448e8bdef6b7c67cc59a [term 1 LEADER]: Becoming Leader. State: Replica: 7f662683c78e448e8bdef6b7c67cc59a, State: Running, Role: LEADER
I20250624 02:16:09.437649 5124 ts_tablet_manager.cc:1428] T 8145758942e34b2a9831cbcb8a200f66 P 7f662683c78e448e8bdef6b7c67cc59a: Time spent starting tablet: real 0.031s user 0.030s sys 0.000s
I20250624 02:16:09.437932 5126 consensus_queue.cc:237] T 8145758942e34b2a9831cbcb8a200f66 P 7f662683c78e448e8bdef6b7c67cc59a [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "7f662683c78e448e8bdef6b7c67cc59a" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 45487 } }
I20250624 02:16:09.450285 4921 catalog_manager.cc:5582] T 8145758942e34b2a9831cbcb8a200f66 P 7f662683c78e448e8bdef6b7c67cc59a reported cstate change: term changed from 0 to 1, leader changed from <none> to 7f662683c78e448e8bdef6b7c67cc59a (127.31.42.193). New cstate: current_term: 1 leader_uuid: "7f662683c78e448e8bdef6b7c67cc59a" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "7f662683c78e448e8bdef6b7c67cc59a" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 45487 } health_report { overall_health: HEALTHY } } }
I20250624 02:16:09.474845 31915 external_mini_cluster.cc:934] 1 TS(s) registered with all masters
I20250624 02:16:09.477856 31915 ts_itest-base.cc:246] Waiting for 1 tablets on tserver 7f662683c78e448e8bdef6b7c67cc59a to finish bootstrapping
I20250624 02:16:12.155292 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 4980
I20250624 02:16:12.180123 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 4888
2025-06-24T02:16:12Z chronyd exiting
[ OK ] ListTableCliSimpleParamTest.TestListTables/2 (6916 ms)
[----------] 1 test from ListTableCliSimpleParamTest (6916 ms total)
[----------] 1 test from ListTableCliParamTest
[ RUN ] ListTableCliParamTest.ListTabletWithPartitionInfo/4
I20250624 02:16:12.233219 31915 test_util.cc:276] Using random seed: -376831510
[ OK ] ListTableCliParamTest.ListTabletWithPartitionInfo/4 (11 ms)
[----------] 1 test from ListTableCliParamTest (11 ms total)
[----------] 1 test from IsSecure/SecureClusterAdminCliParamTest
[ RUN ] IsSecure/SecureClusterAdminCliParamTest.TestRebuildMaster/0
2025-06-24T02:16:12Z chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC -PRIVDROP -SCFILTER -SIGND +ASYNCDNS -NTS -SECHASH -IPV6 +DEBUG)
2025-06-24T02:16:12Z Disabled control of system clock
I20250624 02:16:12.285673 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.31.42.254:34813
--webserver_interface=127.31.42.254
--webserver_port=0
--builtin_ntp_servers=127.31.42.212:39925
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.31.42.254:34813 with env {}
W20250624 02:16:12.588518 5152 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:16:12.589154 5152 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:16:12.589568 5152 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:16:12.620723 5152 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20250624 02:16:12.621027 5152 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:16:12.621243 5152 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20250624 02:16:12.621444 5152 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20250624 02:16:12.657336 5152 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:39925
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.31.42.254:34813
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.31.42.254:34813
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.31.42.254
--webserver_port=0
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true
Master server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:16:12.658618 5152 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:16:12.660187 5152 file_cache.cc:492] Constructed file cache 'file cache' with capacity 419430
W20250624 02:16:12.676184 5159 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:16:12.676244 5158 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:16:12.677090 5161 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:16:12.678524 5152 server_base.cc:1048] running on GCE node
I20250624 02:16:13.847885 5152 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:16:13.850549 5152 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:16:13.851948 5152 hybrid_clock.cc:648] HybridClock initialized: now 1750731373851902 us; error 61 us; skew 500 ppm
I20250624 02:16:13.852739 5152 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:16:13.859609 5152 webserver.cc:469] Webserver started at http://127.31.42.254:36625/ using document root <none> and password file <none>
I20250624 02:16:13.860570 5152 fs_manager.cc:362] Metadata directory not provided
I20250624 02:16:13.860791 5152 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:16:13.861259 5152 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:16:13.865689 5152 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/data/instance:
uuid: "5f8d0811fc2a4e2fbc4d364741f1cbaa"
format_stamp: "Formatted at 2025-06-24 02:16:13 on dist-test-slave-5k9r"
I20250624 02:16:13.866845 5152 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/wal/instance:
uuid: "5f8d0811fc2a4e2fbc4d364741f1cbaa"
format_stamp: "Formatted at 2025-06-24 02:16:13 on dist-test-slave-5k9r"
I20250624 02:16:13.874187 5152 fs_manager.cc:696] Time spent creating directory manager: real 0.007s user 0.007s sys 0.001s
I20250624 02:16:13.879805 5169 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:16:13.880887 5152 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.005s sys 0.000s
I20250624 02:16:13.881234 5152 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/wal
uuid: "5f8d0811fc2a4e2fbc4d364741f1cbaa"
format_stamp: "Formatted at 2025-06-24 02:16:13 on dist-test-slave-5k9r"
I20250624 02:16:13.881573 5152 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:16:13.932889 5152 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:16:13.934456 5152 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:16:13.934926 5152 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:16:14.005880 5152 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.254:34813
I20250624 02:16:14.005999 5220 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.254:34813 every 8 connection(s)
I20250624 02:16:14.008594 5152 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/data/info.pb
I20250624 02:16:14.013706 5221 data_dirs.cc:400] Could only allocate 1 dir of the requested 3 for tablet 00000000000000000000000000000000. 1 dir total, 0 dirs full, 0 dirs failed
I20250624 02:16:14.017962 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 5152
I20250624 02:16:14.018603 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/wal/instance
I20250624 02:16:14.038673 5221 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 5f8d0811fc2a4e2fbc4d364741f1cbaa: Bootstrap starting.
I20250624 02:16:14.044600 5221 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 5f8d0811fc2a4e2fbc4d364741f1cbaa: Neither blocks nor log segments found. Creating new log.
I20250624 02:16:14.046581 5221 log.cc:826] T 00000000000000000000000000000000 P 5f8d0811fc2a4e2fbc4d364741f1cbaa: Log is configured to *not* fsync() on all Append() calls
I20250624 02:16:14.051143 5221 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 5f8d0811fc2a4e2fbc4d364741f1cbaa: No bootstrap required, opened a new log
I20250624 02:16:14.070392 5221 raft_consensus.cc:357] T 00000000000000000000000000000000 P 5f8d0811fc2a4e2fbc4d364741f1cbaa [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "5f8d0811fc2a4e2fbc4d364741f1cbaa" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 34813 } }
I20250624 02:16:14.071187 5221 raft_consensus.cc:383] T 00000000000000000000000000000000 P 5f8d0811fc2a4e2fbc4d364741f1cbaa [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:16:14.071493 5221 raft_consensus.cc:738] T 00000000000000000000000000000000 P 5f8d0811fc2a4e2fbc4d364741f1cbaa [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 5f8d0811fc2a4e2fbc4d364741f1cbaa, State: Initialized, Role: FOLLOWER
I20250624 02:16:14.072355 5221 consensus_queue.cc:260] T 00000000000000000000000000000000 P 5f8d0811fc2a4e2fbc4d364741f1cbaa [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "5f8d0811fc2a4e2fbc4d364741f1cbaa" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 34813 } }
I20250624 02:16:14.073055 5221 raft_consensus.cc:397] T 00000000000000000000000000000000 P 5f8d0811fc2a4e2fbc4d364741f1cbaa [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 02:16:14.073310 5221 raft_consensus.cc:491] T 00000000000000000000000000000000 P 5f8d0811fc2a4e2fbc4d364741f1cbaa [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 02:16:14.073565 5221 raft_consensus.cc:3058] T 00000000000000000000000000000000 P 5f8d0811fc2a4e2fbc4d364741f1cbaa [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:16:14.077598 5221 raft_consensus.cc:513] T 00000000000000000000000000000000 P 5f8d0811fc2a4e2fbc4d364741f1cbaa [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "5f8d0811fc2a4e2fbc4d364741f1cbaa" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 34813 } }
I20250624 02:16:14.078310 5221 leader_election.cc:304] T 00000000000000000000000000000000 P 5f8d0811fc2a4e2fbc4d364741f1cbaa [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 5f8d0811fc2a4e2fbc4d364741f1cbaa; no voters:
I20250624 02:16:14.079967 5221 leader_election.cc:290] T 00000000000000000000000000000000 P 5f8d0811fc2a4e2fbc4d364741f1cbaa [CANDIDATE]: Term 1 election: Requested vote from peers
I20250624 02:16:14.080674 5226 raft_consensus.cc:2802] T 00000000000000000000000000000000 P 5f8d0811fc2a4e2fbc4d364741f1cbaa [term 1 FOLLOWER]: Leader election won for term 1
I20250624 02:16:14.083303 5226 raft_consensus.cc:695] T 00000000000000000000000000000000 P 5f8d0811fc2a4e2fbc4d364741f1cbaa [term 1 LEADER]: Becoming Leader. State: Replica: 5f8d0811fc2a4e2fbc4d364741f1cbaa, State: Running, Role: LEADER
I20250624 02:16:14.084101 5221 sys_catalog.cc:564] T 00000000000000000000000000000000 P 5f8d0811fc2a4e2fbc4d364741f1cbaa [sys.catalog]: configured and running, proceeding with master startup.
I20250624 02:16:14.084225 5226 consensus_queue.cc:237] T 00000000000000000000000000000000 P 5f8d0811fc2a4e2fbc4d364741f1cbaa [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "5f8d0811fc2a4e2fbc4d364741f1cbaa" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 34813 } }
I20250624 02:16:14.097157 5228 sys_catalog.cc:455] T 00000000000000000000000000000000 P 5f8d0811fc2a4e2fbc4d364741f1cbaa [sys.catalog]: SysCatalogTable state changed. Reason: New leader 5f8d0811fc2a4e2fbc4d364741f1cbaa. Latest consensus state: current_term: 1 leader_uuid: "5f8d0811fc2a4e2fbc4d364741f1cbaa" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "5f8d0811fc2a4e2fbc4d364741f1cbaa" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 34813 } } }
I20250624 02:16:14.098088 5228 sys_catalog.cc:458] T 00000000000000000000000000000000 P 5f8d0811fc2a4e2fbc4d364741f1cbaa [sys.catalog]: This master's current role is: LEADER
I20250624 02:16:14.098219 5227 sys_catalog.cc:455] T 00000000000000000000000000000000 P 5f8d0811fc2a4e2fbc4d364741f1cbaa [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "5f8d0811fc2a4e2fbc4d364741f1cbaa" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "5f8d0811fc2a4e2fbc4d364741f1cbaa" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 34813 } } }
I20250624 02:16:14.098839 5227 sys_catalog.cc:458] T 00000000000000000000000000000000 P 5f8d0811fc2a4e2fbc4d364741f1cbaa [sys.catalog]: This master's current role is: LEADER
I20250624 02:16:14.101670 5236 catalog_manager.cc:1477] Loading table and tablet metadata into memory...
I20250624 02:16:14.114219 5236 catalog_manager.cc:1486] Initializing Kudu cluster ID...
I20250624 02:16:14.129201 5236 catalog_manager.cc:1349] Generated new cluster ID: 95eb3201ed764dbbb2bd49b98fe2b75c
I20250624 02:16:14.129521 5236 catalog_manager.cc:1497] Initializing Kudu internal certificate authority...
I20250624 02:16:14.162238 5236 catalog_manager.cc:1372] Generated new certificate authority record
I20250624 02:16:14.163841 5236 catalog_manager.cc:1506] Loading token signing keys...
I20250624 02:16:14.180411 5236 catalog_manager.cc:5955] T 00000000000000000000000000000000 P 5f8d0811fc2a4e2fbc4d364741f1cbaa: Generated new TSK 0
I20250624 02:16:14.181516 5236 catalog_manager.cc:1516] Initializing in-progress tserver states...
I20250624 02:16:14.199331 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.193:0
--local_ip_for_outbound_sockets=127.31.42.193
--webserver_interface=127.31.42.193
--webserver_port=0
--tserver_master_addrs=127.31.42.254:34813
--builtin_ntp_servers=127.31.42.212:39925
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin with env {}
W20250624 02:16:14.510703 5245 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:16:14.511256 5245 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:16:14.511752 5245 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:16:14.542613 5245 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:16:14.543478 5245 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.193
I20250624 02:16:14.579269 5245 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:39925
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.193:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.31.42.193
--webserver_port=0
--tserver_master_addrs=127.31.42.254:34813
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.193
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:16:14.580564 5245 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:16:14.582248 5245 file_cache.cc:492] Constructed file cache 'file cache' with capacity 419430
W20250624 02:16:14.600548 5252 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:16:14.603955 5245 server_base.cc:1048] running on GCE node
W20250624 02:16:14.602543 5251 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:16:14.603596 5254 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:16:15.795711 5245 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:16:15.798719 5245 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:16:15.800262 5245 hybrid_clock.cc:648] HybridClock initialized: now 1750731375800178 us; error 93 us; skew 500 ppm
I20250624 02:16:15.801157 5245 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:16:15.808871 5245 webserver.cc:469] Webserver started at http://127.31.42.193:39883/ using document root <none> and password file <none>
I20250624 02:16:15.809840 5245 fs_manager.cc:362] Metadata directory not provided
I20250624 02:16:15.810130 5245 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:16:15.810633 5245 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:16:15.815173 5245 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/data/instance:
uuid: "0971d3ff884d42dc87eb43e864ed86ca"
format_stamp: "Formatted at 2025-06-24 02:16:15 on dist-test-slave-5k9r"
I20250624 02:16:15.816579 5245 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/wal/instance:
uuid: "0971d3ff884d42dc87eb43e864ed86ca"
format_stamp: "Formatted at 2025-06-24 02:16:15 on dist-test-slave-5k9r"
I20250624 02:16:15.824537 5245 fs_manager.cc:696] Time spent creating directory manager: real 0.007s user 0.006s sys 0.003s
I20250624 02:16:15.830957 5261 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:16:15.832077 5245 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.002s sys 0.002s
I20250624 02:16:15.832386 5245 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/wal
uuid: "0971d3ff884d42dc87eb43e864ed86ca"
format_stamp: "Formatted at 2025-06-24 02:16:15 on dist-test-slave-5k9r"
I20250624 02:16:15.832708 5245 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:16:15.898478 5245 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:16:15.900004 5245 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:16:15.900439 5245 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:16:15.903174 5245 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:16:15.907858 5245 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 02:16:15.908089 5245 ts_tablet_manager.cc:525] Time spent loading tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250624 02:16:15.908344 5245 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 02:16:15.908511 5245 ts_tablet_manager.cc:589] Time spent registering tablets: real 0.000s user 0.000s sys 0.000s
I20250624 02:16:16.046761 5245 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.193:39353
I20250624 02:16:16.046860 5373 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.193:39353 every 8 connection(s)
I20250624 02:16:16.049332 5245 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/data/info.pb
I20250624 02:16:16.056785 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 5245
I20250624 02:16:16.057313 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/wal/instance
I20250624 02:16:16.064455 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.194:0
--local_ip_for_outbound_sockets=127.31.42.194
--webserver_interface=127.31.42.194
--webserver_port=0
--tserver_master_addrs=127.31.42.254:34813
--builtin_ntp_servers=127.31.42.212:39925
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin with env {}
I20250624 02:16:16.072415 5374 heartbeater.cc:344] Connected to a master server at 127.31.42.254:34813
I20250624 02:16:16.072849 5374 heartbeater.cc:461] Registering TS with master...
I20250624 02:16:16.074196 5374 heartbeater.cc:507] Master 127.31.42.254:34813 requested a full tablet report, sending...
I20250624 02:16:16.077193 5186 ts_manager.cc:194] Registered new tserver with Master: 0971d3ff884d42dc87eb43e864ed86ca (127.31.42.193:39353)
I20250624 02:16:16.079175 5186 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.193:50095
W20250624 02:16:16.369228 5378 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:16:16.369684 5378 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:16:16.370250 5378 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:16:16.401631 5378 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:16:16.402529 5378 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.194
I20250624 02:16:16.438458 5378 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:39925
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.194:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.31.42.194
--webserver_port=0
--tserver_master_addrs=127.31.42.254:34813
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.194
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:16:16.439733 5378 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:16:16.441329 5378 file_cache.cc:492] Constructed file cache 'file cache' with capacity 419430
W20250624 02:16:16.456148 5384 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:16:17.083382 5374 heartbeater.cc:499] Master 127.31.42.254:34813 was elected leader, sending a full tablet report...
W20250624 02:16:16.456264 5385 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:16:16.459159 5387 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:16:17.625905 5386 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Resolving timed out after 1168 milliseconds
I20250624 02:16:17.626051 5378 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in a non-cloud environment
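Before falling back to "non-cloud environment" above, the server probes the AWS, Azure, OpenStack, and GCE metadata endpoints concurrently; an HTTP error (the 404s logged earlier) or a timeout (the GCE probe here, which gave up after about 1.2 seconds) each mean "not this cloud". A rough illustrative sketch of that detection pattern, with a hypothetical probe function standing in for the real metadata HTTP calls:

#include <future>
#include <iostream>
#include <string>
#include <vector>

// Hypothetical probe: in the real server each detector issues an HTTP
// request to its cloud's metadata endpoint; a 404 or a timeout is treated
// as "this is not that cloud".
bool ProbeMetadataEndpoint(const std::string& cloud) {
  return false;  // stand-in result: every probe failed, as in the log above
}

int main() {
  const std::vector<std::string> clouds = {"AWS", "Azure", "OpenStack", "GCE"};
  std::vector<std::future<bool>> probes;
  probes.reserve(clouds.size());
  for (const std::string& c : clouds) {
    probes.push_back(std::async(std::launch::async, ProbeMetadataEndpoint, c));
  }
  std::string detected;
  for (size_t i = 0; i < clouds.size(); ++i) {
    if (probes[i].get()) detected = clouds[i];
  }
  if (detected.empty()) {
    std::cout << "unable to detect cloud type of this node" << std::endl;
  } else {
    std::cout << "running on " << detected << " node" << std::endl;
  }
  return 0;
}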
I20250624 02:16:17.627180 5378 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:16:17.629841 5378 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:16:17.631279 5378 hybrid_clock.cc:648] HybridClock initialized: now 1750731377631241 us; error 62 us; skew 500 ppm
I20250624 02:16:17.632086 5378 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:16:17.638294 5378 webserver.cc:469] Webserver started at http://127.31.42.194:33751/ using document root <none> and password file <none>
I20250624 02:16:17.639170 5378 fs_manager.cc:362] Metadata directory not provided
I20250624 02:16:17.639349 5378 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:16:17.639753 5378 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:16:17.645692 5378 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/data/instance:
uuid: "2b0d9569e0d84ee097265b983b992ff9"
format_stamp: "Formatted at 2025-06-24 02:16:17 on dist-test-slave-5k9r"
I20250624 02:16:17.646766 5378 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/wal/instance:
uuid: "2b0d9569e0d84ee097265b983b992ff9"
format_stamp: "Formatted at 2025-06-24 02:16:17 on dist-test-slave-5k9r"
I20250624 02:16:17.653602 5378 fs_manager.cc:696] Time spent creating directory manager: real 0.006s user 0.006s sys 0.001s
I20250624 02:16:17.659058 5394 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:16:17.660072 5378 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.005s sys 0.000s
I20250624 02:16:17.660387 5378 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/wal
uuid: "2b0d9569e0d84ee097265b983b992ff9"
format_stamp: "Formatted at 2025-06-24 02:16:17 on dist-test-slave-5k9r"
I20250624 02:16:17.660707 5378 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:16:17.713519 5378 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:16:17.715050 5378 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:16:17.715474 5378 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:16:17.718087 5378 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:16:17.722029 5378 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 02:16:17.722225 5378 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250624 02:16:17.722471 5378 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 02:16:17.722625 5378 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.000s sys 0.000s
I20250624 02:16:17.854679 5378 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.194:40181
I20250624 02:16:17.854784 5506 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.194:40181 every 8 connection(s)
I20250624 02:16:17.857172 5378 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/data/info.pb
I20250624 02:16:17.862574 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 5378
I20250624 02:16:17.863173 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/wal/instance
I20250624 02:16:17.870077 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.195:0
--local_ip_for_outbound_sockets=127.31.42.195
--webserver_interface=127.31.42.195
--webserver_port=0
--tserver_master_addrs=127.31.42.254:34813
--builtin_ntp_servers=127.31.42.212:39925
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin with env {}
I20250624 02:16:17.878573 5507 heartbeater.cc:344] Connected to a master server at 127.31.42.254:34813
I20250624 02:16:17.879096 5507 heartbeater.cc:461] Registering TS with master...
I20250624 02:16:17.880468 5507 heartbeater.cc:507] Master 127.31.42.254:34813 requested a full tablet report, sending...
I20250624 02:16:17.882877 5186 ts_manager.cc:194] Registered new tserver with Master: 2b0d9569e0d84ee097265b983b992ff9 (127.31.42.194:40181)
I20250624 02:16:17.884079 5186 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.194:60179
W20250624 02:16:18.181421 5511 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:16:18.181972 5511 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:16:18.182483 5511 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:16:18.213716 5511 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:16:18.214614 5511 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.195
I20250624 02:16:18.248728 5511 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:39925
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.195:0
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.31.42.195
--webserver_port=0
--tserver_master_addrs=127.31.42.254:34813
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.195
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:16:18.250082 5511 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:16:18.251817 5511 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:16:18.267755 5518 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:16:18.887758 5507 heartbeater.cc:499] Master 127.31.42.254:34813 was elected leader, sending a full tablet report...
W20250624 02:16:18.267858 5517 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:16:18.270390 5520 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:16:18.270332 5511 server_base.cc:1048] running on GCE node
I20250624 02:16:19.429247 5511 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:16:19.432076 5511 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:16:19.433512 5511 hybrid_clock.cc:648] HybridClock initialized: now 1750731379433455 us; error 76 us; skew 500 ppm
I20250624 02:16:19.434329 5511 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:16:19.445050 5511 webserver.cc:469] Webserver started at http://127.31.42.195:41943/ using document root <none> and password file <none>
I20250624 02:16:19.446024 5511 fs_manager.cc:362] Metadata directory not provided
I20250624 02:16:19.446224 5511 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:16:19.446636 5511 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:16:19.451110 5511 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/data/instance:
uuid: "1b47d4950d0647b9ab21d6f7cd081453"
format_stamp: "Formatted at 2025-06-24 02:16:19 on dist-test-slave-5k9r"
I20250624 02:16:19.452154 5511 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/wal/instance:
uuid: "1b47d4950d0647b9ab21d6f7cd081453"
format_stamp: "Formatted at 2025-06-24 02:16:19 on dist-test-slave-5k9r"
I20250624 02:16:19.459237 5511 fs_manager.cc:696] Time spent creating directory manager: real 0.007s user 0.003s sys 0.005s
I20250624 02:16:19.464769 5527 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:16:19.465914 5511 fs_manager.cc:730] Time spent opening block manager: real 0.003s user 0.001s sys 0.001s
I20250624 02:16:19.466277 5511 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/wal
uuid: "1b47d4950d0647b9ab21d6f7cd081453"
format_stamp: "Formatted at 2025-06-24 02:16:19 on dist-test-slave-5k9r"
I20250624 02:16:19.466750 5511 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:16:19.517401 5511 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:16:19.518851 5511 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:16:19.519248 5511 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:16:19.521618 5511 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:16:19.525527 5511 ts_tablet_manager.cc:579] Loaded tablet metadata (0 total tablets, 0 live tablets)
I20250624 02:16:19.525718 5511 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.000s user 0.000s sys 0.000s
I20250624 02:16:19.525916 5511 ts_tablet_manager.cc:610] Registered 0 tablets
I20250624 02:16:19.526116 5511 ts_tablet_manager.cc:589] Time spent register tablets: real 0.000s user 0.001s sys 0.000s
I20250624 02:16:19.658850 5511 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.195:35639
I20250624 02:16:19.658984 5639 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.195:35639 every 8 connection(s)
I20250624 02:16:19.661394 5511 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/data/info.pb
I20250624 02:16:19.667307 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 5511
I20250624 02:16:19.667896 31915 external_mini_cluster.cc:1427] Reading /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/wal/instance
I20250624 02:16:19.682075 5640 heartbeater.cc:344] Connected to a master server at 127.31.42.254:34813
I20250624 02:16:19.682482 5640 heartbeater.cc:461] Registering TS with master...
I20250624 02:16:19.683454 5640 heartbeater.cc:507] Master 127.31.42.254:34813 requested a full tablet report, sending...
I20250624 02:16:19.685729 5185 ts_manager.cc:194] Registered new tserver with Master: 1b47d4950d0647b9ab21d6f7cd081453 (127.31.42.195:35639)
I20250624 02:16:19.687570 5185 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.195:36013
I20250624 02:16:19.687950 31915 external_mini_cluster.cc:934] 3 TS(s) registered with all masters
I20250624 02:16:19.718792 31915 test_util.cc:276] Using random seed: -369345962
I20250624 02:16:19.759445 5185 catalog_manager.cc:2232] Servicing CreateTable request from {username='slave'} at 127.0.0.1:43268:
name: "pre_rebuild"
schema {
columns {
name: "key"
type: INT32
is_key: true
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "int_val"
type: INT32
is_key: false
is_nullable: false
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
columns {
name: "string_val"
type: STRING
is_key: false
is_nullable: true
encoding: AUTO_ENCODING
compression: DEFAULT_COMPRESSION
cfile_block_size: 0
immutable: false
}
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
range_schema {
columns {
name: "key"
}
}
}
W20250624 02:16:19.761899 5185 catalog_manager.cc:6944] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table pre_rebuild in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20250624 02:16:19.813036 5575 tablet_service.cc:1468] Processing CreateTablet for tablet 0cd08e8911a147fbb51dbb8d39ce6261 (DEFAULT_TABLE table=pre_rebuild [id=28b38e7126be451c8cdc0c265403b068]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 02:16:19.815321 5575 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 0cd08e8911a147fbb51dbb8d39ce6261. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:16:19.819700 5309 tablet_service.cc:1468] Processing CreateTablet for tablet 0cd08e8911a147fbb51dbb8d39ce6261 (DEFAULT_TABLE table=pre_rebuild [id=28b38e7126be451c8cdc0c265403b068]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 02:16:19.822163 5309 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 0cd08e8911a147fbb51dbb8d39ce6261. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:16:19.821653 5442 tablet_service.cc:1468] Processing CreateTablet for tablet 0cd08e8911a147fbb51dbb8d39ce6261 (DEFAULT_TABLE table=pre_rebuild [id=28b38e7126be451c8cdc0c265403b068]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 02:16:19.823661 5442 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 0cd08e8911a147fbb51dbb8d39ce6261. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:16:19.842701 5664 tablet_bootstrap.cc:492] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453: Bootstrap starting.
I20250624 02:16:19.848278 5664 tablet_bootstrap.cc:654] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453: Neither blocks nor log segments found. Creating new log.
I20250624 02:16:19.851509 5664 log.cc:826] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453: Log is configured to *not* fsync() on all Append() calls
I20250624 02:16:19.852944 5665 tablet_bootstrap.cc:492] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca: Bootstrap starting.
I20250624 02:16:19.856122 5666 tablet_bootstrap.cc:492] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9: Bootstrap starting.
I20250624 02:16:19.857395 5664 tablet_bootstrap.cc:492] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453: No bootstrap required, opened a new log
I20250624 02:16:19.858062 5664 ts_tablet_manager.cc:1397] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453: Time spent bootstrapping tablet: real 0.016s user 0.011s sys 0.003s
I20250624 02:16:19.860630 5665 tablet_bootstrap.cc:654] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca: Neither blocks nor log segments found. Creating new log.
I20250624 02:16:19.865597 5666 tablet_bootstrap.cc:654] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9: Neither blocks nor log segments found. Creating new log.
I20250624 02:16:19.866076 5665 log.cc:826] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca: Log is configured to *not* fsync() on all Append() calls
I20250624 02:16:19.869601 5666 log.cc:826] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9: Log is configured to *not* fsync() on all Append() calls
I20250624 02:16:19.877807 5665 tablet_bootstrap.cc:492] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca: No bootstrap required, opened a new log
I20250624 02:16:19.878432 5665 ts_tablet_manager.cc:1397] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca: Time spent bootstrapping tablet: real 0.026s user 0.014s sys 0.008s
I20250624 02:16:19.882385 5666 tablet_bootstrap.cc:492] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9: No bootstrap required, opened a new log
I20250624 02:16:19.882959 5666 ts_tablet_manager.cc:1397] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9: Time spent bootstrapping tablet: real 0.027s user 0.008s sys 0.011s
I20250624 02:16:19.885468 5664 raft_consensus.cc:357] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } }
I20250624 02:16:19.886538 5664 raft_consensus.cc:383] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:16:19.886901 5664 raft_consensus.cc:738] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 1b47d4950d0647b9ab21d6f7cd081453, State: Initialized, Role: FOLLOWER
I20250624 02:16:19.887960 5664 consensus_queue.cc:260] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } }
I20250624 02:16:19.894701 5640 heartbeater.cc:499] Master 127.31.42.254:34813 was elected leader, sending a full tablet report...
I20250624 02:16:19.897428 5664 ts_tablet_manager.cc:1428] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453: Time spent starting tablet: real 0.039s user 0.026s sys 0.008s
I20250624 02:16:19.906134 5665 raft_consensus.cc:357] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } }
I20250624 02:16:19.907074 5665 raft_consensus.cc:383] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:16:19.907399 5665 raft_consensus.cc:738] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 0971d3ff884d42dc87eb43e864ed86ca, State: Initialized, Role: FOLLOWER
I20250624 02:16:19.908305 5665 consensus_queue.cc:260] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } }
I20250624 02:16:19.912789 5665 ts_tablet_manager.cc:1428] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca: Time spent starting tablet: real 0.034s user 0.028s sys 0.004s
I20250624 02:16:19.912773 5671 raft_consensus.cc:491] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250624 02:16:19.913447 5671 raft_consensus.cc:513] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } }
I20250624 02:16:19.914886 5666 raft_consensus.cc:357] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } }
I20250624 02:16:19.915510 5666 raft_consensus.cc:383] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:16:19.915692 5666 raft_consensus.cc:738] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 2b0d9569e0d84ee097265b983b992ff9, State: Initialized, Role: FOLLOWER
W20250624 02:16:19.916188 5641 tablet.cc:2378] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20250624 02:16:19.917881 5666 consensus_queue.cc:260] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } }
I20250624 02:16:19.920014 5671 leader_election.cc:290] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 1b47d4950d0647b9ab21d6f7cd081453 (127.31.42.195:35639), 2b0d9569e0d84ee097265b983b992ff9 (127.31.42.194:40181)
I20250624 02:16:19.923545 5666 ts_tablet_manager.cc:1428] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9: Time spent starting tablet: real 0.040s user 0.030s sys 0.003s
I20250624 02:16:19.933207 5595 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "0cd08e8911a147fbb51dbb8d39ce6261" candidate_uuid: "0971d3ff884d42dc87eb43e864ed86ca" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "1b47d4950d0647b9ab21d6f7cd081453" is_pre_election: true
I20250624 02:16:19.934069 5595 raft_consensus.cc:2466] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 0971d3ff884d42dc87eb43e864ed86ca in term 0.
I20250624 02:16:19.935432 5264 leader_election.cc:304] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 0971d3ff884d42dc87eb43e864ed86ca, 1b47d4950d0647b9ab21d6f7cd081453; no voters:
I20250624 02:16:19.936159 5671 raft_consensus.cc:2802] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [term 0 FOLLOWER]: Leader pre-election won for term 1
I20250624 02:16:19.936461 5671 raft_consensus.cc:491] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20250624 02:16:19.936800 5671 raft_consensus.cc:3058] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:16:19.938513 5462 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "0cd08e8911a147fbb51dbb8d39ce6261" candidate_uuid: "0971d3ff884d42dc87eb43e864ed86ca" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "2b0d9569e0d84ee097265b983b992ff9" is_pre_election: true
I20250624 02:16:19.939239 5462 raft_consensus.cc:2466] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 0971d3ff884d42dc87eb43e864ed86ca in term 0.
I20250624 02:16:19.942615 5671 raft_consensus.cc:513] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } }
I20250624 02:16:19.944248 5671 leader_election.cc:290] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [CANDIDATE]: Term 1 election: Requested vote from peers 1b47d4950d0647b9ab21d6f7cd081453 (127.31.42.195:35639), 2b0d9569e0d84ee097265b983b992ff9 (127.31.42.194:40181)
I20250624 02:16:19.944916 5595 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "0cd08e8911a147fbb51dbb8d39ce6261" candidate_uuid: "0971d3ff884d42dc87eb43e864ed86ca" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "1b47d4950d0647b9ab21d6f7cd081453"
I20250624 02:16:19.945066 5462 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "0cd08e8911a147fbb51dbb8d39ce6261" candidate_uuid: "0971d3ff884d42dc87eb43e864ed86ca" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "2b0d9569e0d84ee097265b983b992ff9"
I20250624 02:16:19.945329 5595 raft_consensus.cc:3058] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453 [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:16:19.945494 5462 raft_consensus.cc:3058] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:16:19.949761 5595 raft_consensus.cc:2466] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 0971d3ff884d42dc87eb43e864ed86ca in term 1.
I20250624 02:16:19.949960 5462 raft_consensus.cc:2466] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 0971d3ff884d42dc87eb43e864ed86ca in term 1.
I20250624 02:16:19.950656 5264 leader_election.cc:304] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 0971d3ff884d42dc87eb43e864ed86ca, 1b47d4950d0647b9ab21d6f7cd081453; no voters:
I20250624 02:16:19.951292 5671 raft_consensus.cc:2802] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [term 1 FOLLOWER]: Leader election won for term 1
I20250624 02:16:19.951753 5671 raft_consensus.cc:695] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [term 1 LEADER]: Becoming Leader. State: Replica: 0971d3ff884d42dc87eb43e864ed86ca, State: Running, Role: LEADER
I20250624 02:16:19.952430 5671 consensus_queue.cc:237] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } }
I20250624 02:16:19.960585 5185 catalog_manager.cc:5582] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca reported cstate change: term changed from 0 to 1, leader changed from <none> to 0971d3ff884d42dc87eb43e864ed86ca (127.31.42.193). New cstate: current_term: 1 leader_uuid: "0971d3ff884d42dc87eb43e864ed86ca" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } health_report { overall_health: UNKNOWN } } }
W20250624 02:16:20.058781 5375 tablet.cc:2378] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca: Can't schedule compaction. Clean time has not been advanced past its initial value.
W20250624 02:16:20.113546 5508 tablet.cc:2378] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20250624 02:16:20.129554 5595 raft_consensus.cc:1273] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453 [term 1 FOLLOWER]: Refusing update from remote peer 0971d3ff884d42dc87eb43e864ed86ca: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20250624 02:16:20.129611 5462 raft_consensus.cc:1273] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [term 1 FOLLOWER]: Refusing update from remote peer 0971d3ff884d42dc87eb43e864ed86ca: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20250624 02:16:20.131245 5672 consensus_queue.cc:1035] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [LEADER]: Connected to new peer: Peer: permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20250624 02:16:20.131902 5671 consensus_queue.cc:1035] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [LEADER]: Connected to new peer: Peer: permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20250624 02:16:20.174841 5684 mvcc.cc:204] Tried to move back new op lower bound from 7170995732994527232 to 7170995732292386816. Current Snapshot: MvccSnapshot[applied={T|T < 7170995732994527232}]
I20250624 02:16:25.268579 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 5152
W20250624 02:16:25.613291 5718 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:16:25.613880 5718 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:16:25.647403 5718 flags.cc:425] Enabled experimental flag: --enable_multi_tenancy=false
W20250624 02:16:26.243716 5374 heartbeater.cc:646] Failed to heartbeat to 127.31.42.254:34813 (0 consecutive failures): Network error: Failed to send heartbeat to master: Client connection negotiation failed: client connection to 127.31.42.254:34813: connect: Connection refused (error 111)
W20250624 02:16:26.256273 5507 heartbeater.cc:646] Failed to heartbeat to 127.31.42.254:34813 (0 consecutive failures): Network error: Failed to send heartbeat to master: Client connection negotiation failed: client connection to 127.31.42.254:34813: connect: Connection refused (error 111)
W20250624 02:16:26.263674 5640 heartbeater.cc:646] Failed to heartbeat to 127.31.42.254:34813 (0 consecutive failures): Network error: Failed to send heartbeat to master: Client connection negotiation failed: client connection to 127.31.42.254:34813: connect: Connection refused (error 111)
W20250624 02:16:27.104665 5723 debug-util.cc:398] Leaking SignalData structure 0x7b0800036040 after lost signal to thread 5718
W20250624 02:16:27.105233 5723 kernel_stack_watchdog.cc:198] Thread 5718 stuck at /home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/util/thread.cc:641 for 400ms:
Kernel stack:
(could not read kernel stack)
User stack:
<Timed out: thread did not respond: maybe it is blocking signals>
W20250624 02:16:27.241128 5718 thread.cc:640] rpc reactor (reactor) Time spent creating pthread: real 1.538s user 0.522s sys 0.992s
W20250624 02:16:27.360853 5718 thread.cc:606] rpc reactor (reactor) Time spent starting thread: real 1.658s user 0.524s sys 1.002s
I20250624 02:16:27.434938 5718 minidump.cc:252] Setting minidump size limit to 20M
I20250624 02:16:27.437361 5718 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:16:27.438508 5718 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:16:27.449422 5752 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:16:27.450345 5753 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:16:27.452016 5755 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:16:27.452272 5718 server_base.cc:1048] running on GCE node
I20250624 02:16:27.453394 5718 hybrid_clock.cc:584] initializing the hybrid clock with 'system' time source
I20250624 02:16:27.453806 5718 hybrid_clock.cc:648] HybridClock initialized: now 1750731387453782 us; error 307004 us; skew 500 ppm
I20250624 02:16:27.454531 5718 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:16:27.458763 5718 webserver.cc:469] Webserver started at http://0.0.0.0:36885/ using document root <none> and password file <none>
I20250624 02:16:27.459564 5718 fs_manager.cc:362] Metadata directory not provided
I20250624 02:16:27.459790 5718 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:16:27.460219 5718 server_base.cc:896] This appears to be a new deployment of Kudu; creating new FS layout
I20250624 02:16:27.464417 5718 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/data/instance:
uuid: "7d986e92960341c6a16aa83dde4637c6"
format_stamp: "Formatted at 2025-06-24 02:16:27 on dist-test-slave-5k9r"
I20250624 02:16:27.465495 5718 fs_manager.cc:1068] Generated new instance metadata in path /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/wal/instance:
uuid: "7d986e92960341c6a16aa83dde4637c6"
format_stamp: "Formatted at 2025-06-24 02:16:27 on dist-test-slave-5k9r"
I20250624 02:16:27.471640 5718 fs_manager.cc:696] Time spent creating directory manager: real 0.006s user 0.000s sys 0.004s
I20250624 02:16:27.476970 5760 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:16:27.477856 5718 fs_manager.cc:730] Time spent opening block manager: real 0.003s user 0.000s sys 0.002s
I20250624 02:16:27.478227 5718 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/wal
uuid: "7d986e92960341c6a16aa83dde4637c6"
format_stamp: "Formatted at 2025-06-24 02:16:27 on dist-test-slave-5k9r"
I20250624 02:16:27.478550 5718 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:16:27.539270 5718 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:16:27.540673 5718 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:16:27.541100 5718 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:16:27.545686 5718 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet 00000000000000000000000000000000. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:16:27.560068 5718 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6: Bootstrap starting.
I20250624 02:16:27.564980 5718 tablet_bootstrap.cc:654] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6: Neither blocks nor log segments found. Creating new log.
I20250624 02:16:27.566649 5718 log.cc:826] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6: Log is configured to *not* fsync() on all Append() calls
I20250624 02:16:27.570792 5718 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6: No bootstrap required, opened a new log
I20250624 02:16:27.587535 5718 raft_consensus.cc:357] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "7d986e92960341c6a16aa83dde4637c6" member_type: VOTER }
I20250624 02:16:27.588032 5718 raft_consensus.cc:383] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:16:27.588294 5718 raft_consensus.cc:738] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7d986e92960341c6a16aa83dde4637c6, State: Initialized, Role: FOLLOWER
I20250624 02:16:27.588960 5718 consensus_queue.cc:260] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "7d986e92960341c6a16aa83dde4637c6" member_type: VOTER }
I20250624 02:16:27.589435 5718 raft_consensus.cc:397] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [term 0 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 02:16:27.589699 5718 raft_consensus.cc:491] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [term 0 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 02:16:27.590003 5718 raft_consensus.cc:3058] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:16:27.594038 5718 raft_consensus.cc:513] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "7d986e92960341c6a16aa83dde4637c6" member_type: VOTER }
I20250624 02:16:27.594686 5718 leader_election.cc:304] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 7d986e92960341c6a16aa83dde4637c6; no voters:
I20250624 02:16:27.596357 5718 leader_election.cc:290] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [CANDIDATE]: Term 1 election: Requested vote from peers
I20250624 02:16:27.596585 5767 raft_consensus.cc:2802] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [term 1 FOLLOWER]: Leader election won for term 1
I20250624 02:16:27.599339 5767 raft_consensus.cc:695] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [term 1 LEADER]: Becoming Leader. State: Replica: 7d986e92960341c6a16aa83dde4637c6, State: Running, Role: LEADER
I20250624 02:16:27.600252 5767 consensus_queue.cc:237] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "7d986e92960341c6a16aa83dde4637c6" member_type: VOTER }
I20250624 02:16:27.607422 5768 sys_catalog.cc:455] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 7d986e92960341c6a16aa83dde4637c6. Latest consensus state: current_term: 1 leader_uuid: "7d986e92960341c6a16aa83dde4637c6" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "7d986e92960341c6a16aa83dde4637c6" member_type: VOTER } }
I20250624 02:16:27.607887 5768 sys_catalog.cc:458] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [sys.catalog]: This master's current role is: LEADER
I20250624 02:16:27.610726 5769 sys_catalog.cc:455] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 1 leader_uuid: "7d986e92960341c6a16aa83dde4637c6" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "7d986e92960341c6a16aa83dde4637c6" member_type: VOTER } }
I20250624 02:16:27.611297 5769 sys_catalog.cc:458] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [sys.catalog]: This master's current role is: LEADER
I20250624 02:16:27.619889 5718 tablet_replica.cc:331] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6: stopping tablet replica
I20250624 02:16:27.620507 5718 raft_consensus.cc:2241] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [term 1 LEADER]: Raft consensus shutting down.
I20250624 02:16:27.620944 5718 raft_consensus.cc:2270] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [term 1 FOLLOWER]: Raft consensus is shut down!
I20250624 02:16:27.623052 5718 master.cc:561] Master@0.0.0.0:7051 shutting down...
W20250624 02:16:27.623543 5718 acceptor_pool.cc:196] Could not shut down acceptor socket on 0.0.0.0:7051: Network error: shutdown error: Transport endpoint is not connected (error 107)
I20250624 02:16:27.678090 5718 master.cc:583] Master@0.0.0.0:7051 shutdown complete.
I20250624 02:16:28.713377 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 5245
I20250624 02:16:28.758661 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 5378
I20250624 02:16:28.796715 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 5511
I20250624 02:16:28.833433 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
master
run
--ipki_ca_key_size=768
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.31.42.254:34813
--webserver_interface=127.31.42.254
--webserver_port=36625
--builtin_ntp_servers=127.31.42.212:39925
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--rpc_reuseport=true
--master_addresses=127.31.42.254:34813 with env {}
W20250624 02:16:29.142556 5777 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:16:29.143148 5777 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:16:29.143605 5777 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:16:29.174806 5777 flags.cc:425] Enabled experimental flag: --ipki_ca_key_size=768
W20250624 02:16:29.175140 5777 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:16:29.175388 5777 flags.cc:425] Enabled experimental flag: --tsk_num_rsa_bits=512
W20250624 02:16:29.175626 5777 flags.cc:425] Enabled experimental flag: --rpc_reuseport=true
I20250624 02:16:29.211419 5777 master_runner.cc:386] Master server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:39925
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/wal
--ipki_ca_key_size=768
--master_addresses=127.31.42.254:34813
--ipki_server_key_size=768
--openssl_security_level_override=0
--tsk_num_rsa_bits=512
--rpc_bind_addresses=127.31.42.254:34813
--rpc_reuseport=true
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/data/info.pb
--webserver_interface=127.31.42.254
--webserver_port=36625
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/logs
--logbuflevel=-1
--logtostderr=true
Master server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:16:29.212723 5777 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:16:29.214435 5777 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:16:29.227963 5784 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:16:29.227991 5783 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:16:30.402530 5786 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:16:30.404507 5785 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Resolving timed out after 1177 milliseconds
I20250624 02:16:30.404584 5777 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
I20250624 02:16:30.405790 5777 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:16:30.408428 5777 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:16:30.409778 5777 hybrid_clock.cc:648] HybridClock initialized: now 1750731390409748 us; error 37 us; skew 500 ppm
I20250624 02:16:30.410619 5777 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:16:30.434625 5777 webserver.cc:469] Webserver started at http://127.31.42.254:36625/ using document root <none> and password file <none>
I20250624 02:16:30.435626 5777 fs_manager.cc:362] Metadata directory not provided
I20250624 02:16:30.435883 5777 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:16:30.443921 5777 fs_manager.cc:714] Time spent opening directory manager: real 0.005s user 0.005s sys 0.000s
I20250624 02:16:30.448589 5793 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:16:30.449643 5777 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.001s sys 0.001s
I20250624 02:16:30.450013 5777 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/wal
uuid: "7d986e92960341c6a16aa83dde4637c6"
format_stamp: "Formatted at 2025-06-24 02:16:27 on dist-test-slave-5k9r"
I20250624 02:16:30.451972 5777 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:16:30.509827 5777 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:16:30.511415 5777 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:16:30.511863 5777 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:16:30.583662 5777 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.254:34813
I20250624 02:16:30.583748 5844 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.254:34813 every 8 connection(s)
I20250624 02:16:30.586453 5777 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/data/info.pb
I20250624 02:16:30.587845 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 5777
I20250624 02:16:30.589377 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.193:39353
--local_ip_for_outbound_sockets=127.31.42.193
--tserver_master_addrs=127.31.42.254:34813
--webserver_port=39883
--webserver_interface=127.31.42.193
--builtin_ntp_servers=127.31.42.212:39925
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin with env {}
I20250624 02:16:30.597210 5845 sys_catalog.cc:263] Verifying existing consensus state
I20250624 02:16:30.609311 5845 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6: Bootstrap starting.
I20250624 02:16:30.619207 5845 log.cc:826] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6: Log is configured to *not* fsync() on all Append() calls
I20250624 02:16:30.630934 5845 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6: Bootstrap replayed 1/1 log segments. Stats: ops{read=2 overwritten=0 applied=2 ignored=0} inserts{seen=2 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 02:16:30.631714 5845 tablet_bootstrap.cc:492] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6: Bootstrap complete.
I20250624 02:16:30.651970 5845 raft_consensus.cc:357] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "7d986e92960341c6a16aa83dde4637c6" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 34813 } }
I20250624 02:16:30.652640 5845 raft_consensus.cc:738] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 7d986e92960341c6a16aa83dde4637c6, State: Initialized, Role: FOLLOWER
I20250624 02:16:30.653337 5845 consensus_queue.cc:260] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 2, Last appended: 1.2, Last appended by leader: 2, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "7d986e92960341c6a16aa83dde4637c6" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 34813 } }
I20250624 02:16:30.653828 5845 raft_consensus.cc:397] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [term 1 FOLLOWER]: Only one voter in the Raft config. Triggering election immediately
I20250624 02:16:30.654162 5845 raft_consensus.cc:491] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [term 1 FOLLOWER]: Starting leader election (initial election of a single-replica configuration)
I20250624 02:16:30.654492 5845 raft_consensus.cc:3058] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [term 1 FOLLOWER]: Advancing to term 2
I20250624 02:16:30.658573 5845 raft_consensus.cc:513] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "7d986e92960341c6a16aa83dde4637c6" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 34813 } }
I20250624 02:16:30.659268 5845 leader_election.cc:304] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 1 responses out of 1 voters: 1 yes votes; 0 no votes. yes voters: 7d986e92960341c6a16aa83dde4637c6; no voters:
I20250624 02:16:30.661463 5845 leader_election.cc:290] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [CANDIDATE]: Term 2 election: Requested vote from peers
I20250624 02:16:30.661876 5849 raft_consensus.cc:2802] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [term 2 FOLLOWER]: Leader election won for term 2
I20250624 02:16:30.665206 5849 raft_consensus.cc:695] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [term 2 LEADER]: Becoming Leader. State: Replica: 7d986e92960341c6a16aa83dde4637c6, State: Running, Role: LEADER
I20250624 02:16:30.666076 5849 consensus_queue.cc:237] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 2, Committed index: 2, Last appended: 1.2, Last appended by leader: 2, Current term: 2, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "7d986e92960341c6a16aa83dde4637c6" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 34813 } }
I20250624 02:16:30.666853 5845 sys_catalog.cc:564] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [sys.catalog]: configured and running, proceeding with master startup.
I20250624 02:16:30.676921 5850 sys_catalog.cc:455] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [sys.catalog]: SysCatalogTable state changed. Reason: RaftConsensus started. Latest consensus state: current_term: 2 leader_uuid: "7d986e92960341c6a16aa83dde4637c6" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "7d986e92960341c6a16aa83dde4637c6" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 34813 } } }
I20250624 02:16:30.679359 5850 sys_catalog.cc:458] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [sys.catalog]: This master's current role is: LEADER
I20250624 02:16:30.678560 5851 sys_catalog.cc:455] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [sys.catalog]: SysCatalogTable state changed. Reason: New leader 7d986e92960341c6a16aa83dde4637c6. Latest consensus state: current_term: 2 leader_uuid: "7d986e92960341c6a16aa83dde4637c6" committed_config { opid_index: -1 OBSOLETE_local: true peers { permanent_uuid: "7d986e92960341c6a16aa83dde4637c6" member_type: VOTER last_known_addr { host: "127.31.42.254" port: 34813 } } }
I20250624 02:16:30.680434 5851 sys_catalog.cc:458] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6 [sys.catalog]: This master's current role is: LEADER
I20250624 02:16:30.683579 5856 catalog_manager.cc:1477] Loading table and tablet metadata into memory...
I20250624 02:16:30.697768 5856 catalog_manager.cc:671] Loaded metadata for table pre_rebuild [id=3e796535e3dc4e5294e87050d8c55a7c]
I20250624 02:16:30.707175 5856 tablet_loader.cc:96] loaded metadata for tablet 0cd08e8911a147fbb51dbb8d39ce6261 (table pre_rebuild [id=3e796535e3dc4e5294e87050d8c55a7c])
I20250624 02:16:30.709889 5856 catalog_manager.cc:1486] Initializing Kudu cluster ID...
I20250624 02:16:30.744747 5856 catalog_manager.cc:1349] Generated new cluster ID: 0488158170514eabb027d53fd41f710c
I20250624 02:16:30.745138 5856 catalog_manager.cc:1497] Initializing Kudu internal certificate authority...
I20250624 02:16:30.789551 5867 catalog_manager.cc:797] Waiting for catalog manager background task thread to start: Service unavailable: Catalog manager is not initialized. State: Starting
I20250624 02:16:30.817143 5856 catalog_manager.cc:1372] Generated new certificate authority record
I20250624 02:16:30.818697 5856 catalog_manager.cc:1506] Loading token signing keys...
I20250624 02:16:30.831274 5856 catalog_manager.cc:5955] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6: Generated new TSK 0
I20250624 02:16:30.832150 5856 catalog_manager.cc:1516] Initializing in-progress tserver states...
W20250624 02:16:30.974013 5847 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:16:30.974529 5847 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:16:30.975029 5847 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:16:31.006911 5847 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:16:31.007778 5847 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.193
I20250624 02:16:31.042475 5847 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:39925
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.193:39353
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/data/info.pb
--webserver_interface=127.31.42.193
--webserver_port=39883
--tserver_master_addrs=127.31.42.254:34813
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.193
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:16:31.043754 5847 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:16:31.045327 5847 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:16:31.062744 5874 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:16:31.062870 5873 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:16:31.065193 5876 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:16:32.247372 5875 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Connection timed out after 1185 milliseconds
I20250624 02:16:32.247411 5847 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
I20250624 02:16:32.251827 5847 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:16:32.260300 5847 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:16:32.261873 5847 hybrid_clock.cc:648] HybridClock initialized: now 1750731392261804 us; error 87 us; skew 500 ppm
I20250624 02:16:32.262737 5847 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:16:32.270047 5847 webserver.cc:469] Webserver started at http://127.31.42.193:39883/ using document root <none> and password file <none>
I20250624 02:16:32.271005 5847 fs_manager.cc:362] Metadata directory not provided
I20250624 02:16:32.271246 5847 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:16:32.279914 5847 fs_manager.cc:714] Time spent opening directory manager: real 0.005s user 0.001s sys 0.004s
I20250624 02:16:32.284902 5883 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:16:32.286046 5847 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.003s sys 0.000s
I20250624 02:16:32.286370 5847 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/wal
uuid: "0971d3ff884d42dc87eb43e864ed86ca"
format_stamp: "Formatted at 2025-06-24 02:16:15 on dist-test-slave-5k9r"
I20250624 02:16:32.288455 5847 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:16:32.351457 5847 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:16:32.352959 5847 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:16:32.353389 5847 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:16:32.356019 5847 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:16:32.362414 5890 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20250624 02:16:32.373405 5847 ts_tablet_manager.cc:579] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20250624 02:16:32.373677 5847 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.013s user 0.000s sys 0.002s
I20250624 02:16:32.374094 5847 ts_tablet_manager.cc:594] Registering tablets (0/1 complete)
I20250624 02:16:32.378855 5847 ts_tablet_manager.cc:610] Registered 1 tablets
I20250624 02:16:32.379050 5847 ts_tablet_manager.cc:589] Time spent register tablets: real 0.005s user 0.004s sys 0.000s
I20250624 02:16:32.379473 5890 tablet_bootstrap.cc:492] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca: Bootstrap starting.
I20250624 02:16:32.567955 5847 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.193:39353
I20250624 02:16:32.568202 5996 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.193:39353 every 8 connection(s)
I20250624 02:16:32.570796 5847 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/data/info.pb
I20250624 02:16:32.581223 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 5847
I20250624 02:16:32.583199 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.194:40181
--local_ip_for_outbound_sockets=127.31.42.194
--tserver_master_addrs=127.31.42.254:34813
--webserver_port=33751
--webserver_interface=127.31.42.194
--builtin_ntp_servers=127.31.42.212:39925
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin with env {}
I20250624 02:16:32.602851 5997 heartbeater.cc:344] Connected to a master server at 127.31.42.254:34813
I20250624 02:16:32.603367 5997 heartbeater.cc:461] Registering TS with master...
I20250624 02:16:32.604614 5997 heartbeater.cc:507] Master 127.31.42.254:34813 requested a full tablet report, sending...
I20250624 02:16:32.609282 5810 ts_manager.cc:194] Registered new tserver with Master: 0971d3ff884d42dc87eb43e864ed86ca (127.31.42.193:39353)
I20250624 02:16:32.617990 5810 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.193:56569
I20250624 02:16:32.644292 5890 log.cc:826] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca: Log is configured to *not* fsync() on all Append() calls
W20250624 02:16:32.900667 6001 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:16:32.901171 6001 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:16:32.901660 6001 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:16:32.932927 6001 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:16:32.933773 6001 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.194
I20250624 02:16:32.969025 6001 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:39925
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.194:40181
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/data/info.pb
--webserver_interface=127.31.42.194
--webserver_port=33751
--tserver_master_addrs=127.31.42.254:34813
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.194
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:16:32.970340 6001 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:16:32.971951 6001 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:16:32.990525 6010 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:16:33.623464 5997 heartbeater.cc:499] Master 127.31.42.254:34813 was elected leader, sending a full tablet report...
W20250624 02:16:34.388074 6008 debug-util.cc:398] Leaking SignalData structure 0x7b0800000a80 after lost signal to thread 6001
W20250624 02:16:32.990613 6009 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:16:34.470009 6001 thread.cc:640] OpenStack (cloud detector) Time spent creating pthread: real 1.483s user 0.466s sys 0.986s
W20250624 02:16:34.471849 6011 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Resolving timed out after 1480 milliseconds
W20250624 02:16:34.472364 6001 thread.cc:606] OpenStack (cloud detector) Time spent starting thread: real 1.486s user 0.467s sys 0.986s
W20250624 02:16:34.472304 6012 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:16:34.472808 6001 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
I20250624 02:16:34.477166 6001 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:16:34.479374 6001 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:16:34.480713 6001 hybrid_clock.cc:648] HybridClock initialized: now 1750731394480616 us; error 105 us; skew 500 ppm
I20250624 02:16:34.481490 6001 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:16:34.488168 6001 webserver.cc:469] Webserver started at http://127.31.42.194:33751/ using document root <none> and password file <none>
I20250624 02:16:34.489122 6001 fs_manager.cc:362] Metadata directory not provided
I20250624 02:16:34.489354 6001 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:16:34.497465 6001 fs_manager.cc:714] Time spent opening directory manager: real 0.005s user 0.006s sys 0.000s
I20250624 02:16:34.503778 6019 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:16:34.504871 6001 fs_manager.cc:730] Time spent opening block manager: real 0.005s user 0.003s sys 0.002s
I20250624 02:16:34.505183 6001 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/wal
uuid: "2b0d9569e0d84ee097265b983b992ff9"
format_stamp: "Formatted at 2025-06-24 02:16:17 on dist-test-slave-5k9r"
I20250624 02:16:34.507110 6001 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:16:34.565358 6001 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:16:34.566828 6001 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:16:34.567258 6001 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:16:34.570405 6001 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:16:34.577082 6026 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20250624 02:16:34.585798 6001 ts_tablet_manager.cc:579] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20250624 02:16:34.586133 6001 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.011s user 0.001s sys 0.001s
I20250624 02:16:34.586479 6001 ts_tablet_manager.cc:594] Registering tablets (0/1 complete)
I20250624 02:16:34.593562 6001 ts_tablet_manager.cc:610] Registered 1 tablets
I20250624 02:16:34.593842 6001 ts_tablet_manager.cc:589] Time spent register tablets: real 0.007s user 0.007s sys 0.000s
I20250624 02:16:34.594251 6026 tablet_bootstrap.cc:492] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9: Bootstrap starting.
I20250624 02:16:34.795956 6001 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.194:40181
I20250624 02:16:34.796100 6132 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.194:40181 every 8 connection(s)
I20250624 02:16:34.799614 6001 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/data/info.pb
I20250624 02:16:34.809033 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 6001
I20250624 02:16:34.811306 31915 external_mini_cluster.cc:1351] Running /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/wal
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/data
--block_manager=log
--webserver_interface=localhost
--never_fsync
--enable_minidumps=false
--redact=none
--metrics_log_interval_ms=1000
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/logs
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/data/info.pb
--server_dump_info_format=pb
--rpc_server_allow_ephemeral_ports
--unlock_experimental_flags
--unlock_unsafe_flags
--logtostderr
--logbuflevel=-1
--ipki_server_key_size=768
--openssl_security_level_override=0
tserver
run
--rpc_bind_addresses=127.31.42.195:35639
--local_ip_for_outbound_sockets=127.31.42.195
--tserver_master_addrs=127.31.42.254:34813
--webserver_port=41943
--webserver_interface=127.31.42.195
--builtin_ntp_servers=127.31.42.212:39925
--builtin_ntp_poll_interval_ms=100
--ntp_initial_sync_wait_secs=10
--time_source=builtin with env {}
I20250624 02:16:34.834208 6133 heartbeater.cc:344] Connected to a master server at 127.31.42.254:34813
I20250624 02:16:34.834673 6133 heartbeater.cc:461] Registering TS with master...
I20250624 02:16:34.835884 6133 heartbeater.cc:507] Master 127.31.42.254:34813 requested a full tablet report, sending...
I20250624 02:16:34.839926 5810 ts_manager.cc:194] Registered new tserver with Master: 2b0d9569e0d84ee097265b983b992ff9 (127.31.42.194:40181)
I20250624 02:16:34.843693 5810 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.194:41207
I20250624 02:16:34.931252 6026 log.cc:826] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9: Log is configured to *not* fsync() on all Append() calls
W20250624 02:16:35.129405 6137 flags.cc:425] Enabled unsafe flag: --openssl_security_level_override=0
W20250624 02:16:35.129922 6137 flags.cc:425] Enabled unsafe flag: --rpc_server_allow_ephemeral_ports=true
W20250624 02:16:35.130468 6137 flags.cc:425] Enabled unsafe flag: --never_fsync=true
W20250624 02:16:35.161796 6137 flags.cc:425] Enabled experimental flag: --ipki_server_key_size=768
W20250624 02:16:35.162678 6137 flags.cc:425] Enabled experimental flag: --local_ip_for_outbound_sockets=127.31.42.195
I20250624 02:16:35.198096 6137 tablet_server_runner.cc:78] Tablet server non-default flags:
--builtin_ntp_poll_interval_ms=100
--builtin_ntp_servers=127.31.42.212:39925
--ntp_initial_sync_wait_secs=10
--time_source=builtin
--fs_data_dirs=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/data
--fs_wal_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/wal
--ipki_server_key_size=768
--openssl_security_level_override=0
--rpc_bind_addresses=127.31.42.195:35639
--rpc_server_allow_ephemeral_ports=true
--metrics_log_interval_ms=1000
--server_dump_info_format=pb
--server_dump_info_path=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/data/info.pb
--webserver_interface=127.31.42.195
--webserver_port=41943
--tserver_master_addrs=127.31.42.254:34813
--never_fsync=true
--redact=none
--unlock_experimental_flags=true
--unlock_unsafe_flags=true
--enable_minidumps=false
--local_ip_for_outbound_sockets=127.31.42.195
--log_dir=/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/logs
--logbuflevel=-1
--logtostderr=true
Tablet server version:
kudu 1.18.0-SNAPSHOT
revision 6d72d4a242076468501f3430b9a2cd050c634be2
build type FASTDEBUG
built by None at 24 Jun 2025 02:07:26 UTC on 24a791456cd2
build id 6741
TSAN enabled
I20250624 02:16:35.199363 6137 env_posix.cc:2264] Not raising this process' open files per process limit of 1048576; it is already as high as it can go
I20250624 02:16:35.201016 6137 file_cache.cc:492] Constructed file cache file cache with capacity 419430
W20250624 02:16:35.220150 6144 instance_detector.cc:116] could not retrieve AWS instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
I20250624 02:16:35.847877 6133 heartbeater.cc:499] Master 127.31.42.254:34813 was elected leader, sending a full tablet report...
I20250624 02:16:36.203372 5890 tablet_bootstrap.cc:492] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca: Bootstrap replayed 1/1 log segments. Stats: ops{read=205 overwritten=0 applied=205 ignored=0} inserts{seen=10200 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 02:16:36.204818 5890 tablet_bootstrap.cc:492] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca: Bootstrap complete.
I20250624 02:16:36.207247 5890 ts_tablet_manager.cc:1397] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca: Time spent bootstrapping tablet: real 3.828s user 3.640s sys 0.076s
I20250624 02:16:36.230414 5890 raft_consensus.cc:357] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } }
I20250624 02:16:36.234787 5890 raft_consensus.cc:738] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 0971d3ff884d42dc87eb43e864ed86ca, State: Initialized, Role: FOLLOWER
I20250624 02:16:36.236344 5890 consensus_queue.cc:260] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 205, Last appended: 1.205, Last appended by leader: 205, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } }
I20250624 02:16:36.252257 5890 ts_tablet_manager.cc:1428] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca: Time spent starting tablet: real 0.045s user 0.034s sys 0.008s
W20250624 02:16:36.614720 6143 debug-util.cc:398] Leaking SignalData structure 0x7b0800000a80 after lost signal to thread 6137
W20250624 02:16:35.222028 6145 instance_detector.cc:116] could not retrieve Azure instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:16:37.204526 6137 thread.cc:640] OpenStack (cloud detector) Time spent creating pthread: real 1.983s user 0.519s sys 1.136s
W20250624 02:16:37.207027 6137 thread.cc:606] OpenStack (cloud detector) Time spent starting thread: real 1.985s user 0.520s sys 1.136s
W20250624 02:16:37.207574 6147 instance_detector.cc:116] could not retrieve OpenStack instance metadata: Network error: curl error: HTTP response code said error: The requested URL returned error: 404
W20250624 02:16:37.209879 6146 instance_detector.cc:116] could not retrieve GCE instance metadata: Timed out: curl timeout: Timeout was reached: Resolving timed out after 1984 milliseconds
I20250624 02:16:37.209964 6137 server_base.cc:1043] Not found: could not retrieve instance metadata: unable to detect cloud type of this node, probably running in non-cloud environment
I20250624 02:16:37.211314 6137 hybrid_clock.cc:584] initializing the hybrid clock with 'builtin' time source
I20250624 02:16:37.213510 6137 hybrid_clock.cc:630] waiting up to --ntp_initial_sync_wait_secs=10 seconds for the clock to synchronize
I20250624 02:16:37.214848 6137 hybrid_clock.cc:648] HybridClock initialized: now 1750731397214813 us; error 38 us; skew 500 ppm
I20250624 02:16:37.215643 6137 server_base.cc:848] Flag tcmalloc_max_total_thread_cache_bytes is not working since tcmalloc is not enabled.
I20250624 02:16:37.221807 6137 webserver.cc:469] Webserver started at http://127.31.42.195:41943/ using document root <none> and password file <none>
I20250624 02:16:37.222893 6137 fs_manager.cc:362] Metadata directory not provided
I20250624 02:16:37.223119 6137 fs_manager.cc:368] Using write-ahead log directory (fs_wal_dir) as metadata directory
I20250624 02:16:37.230995 6137 fs_manager.cc:714] Time spent opening directory manager: real 0.005s user 0.002s sys 0.001s
I20250624 02:16:37.235951 6155 log_block_manager.cc:3788] Time spent loading block containers with low live blocks: real 0.000s user 0.000s sys 0.000s
I20250624 02:16:37.237156 6137 fs_manager.cc:730] Time spent opening block manager: real 0.004s user 0.004s sys 0.000s
I20250624 02:16:37.237490 6137 fs_manager.cc:647] Opened local filesystem: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/data,/tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/wal
uuid: "1b47d4950d0647b9ab21d6f7cd081453"
format_stamp: "Formatted at 2025-06-24 02:16:19 on dist-test-slave-5k9r"
I20250624 02:16:37.239439 6137 fs_report.cc:389] FS layout report
--------------------
wal directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/wal
metadata directory: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/wal
1 data directories: /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/data/data
Total live blocks: 0
Total live bytes: 0
Total live bytes (after alignment): 0
Total number of LBM containers: 0 (0 full)
Did not check for missing blocks
Did not check for orphaned blocks
Total full LBM containers with extra space: 0 (0 repaired)
Total full LBM container extra space in bytes: 0 (0 repaired)
Total incomplete LBM containers: 0 (0 repaired)
Total LBM partial records: 0 (0 repaired)
Total corrupted LBM metadata records in RocksDB: 0 (0 repaired)
I20250624 02:16:37.301380 6137 rpc_server.cc:225] running with OpenSSL 1.1.1 11 Sep 2018
I20250624 02:16:37.302897 6137 env_posix.cc:2264] Not raising this process' running threads per effective uid limit of 18446744073709551615; it is already as high as it can go
I20250624 02:16:37.303333 6137 kserver.cc:163] Server-wide thread pool size limit: 3276
I20250624 02:16:37.306056 6137 txn_system_client.cc:432] TxnSystemClient initialization is disabled...
I20250624 02:16:37.312181 6162 ts_tablet_manager.cc:542] Loading tablet metadata (0/1 complete)
I20250624 02:16:37.323289 6137 ts_tablet_manager.cc:579] Loaded tablet metadata (1 total tablets, 1 live tablets)
I20250624 02:16:37.323539 6137 ts_tablet_manager.cc:525] Time spent load tablet metadata: real 0.013s user 0.002s sys 0.000s
I20250624 02:16:37.323805 6137 ts_tablet_manager.cc:594] Registering tablets (0/1 complete)
I20250624 02:16:37.328511 6137 ts_tablet_manager.cc:610] Registered 1 tablets
I20250624 02:16:37.328712 6137 ts_tablet_manager.cc:589] Time spent register tablets: real 0.005s user 0.001s sys 0.002s
I20250624 02:16:37.329182 6162 tablet_bootstrap.cc:492] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453: Bootstrap starting.
I20250624 02:16:37.537125 6137 rpc_server.cc:307] RPC server started. Bound to: 127.31.42.195:35639
I20250624 02:16:37.537339 6268 acceptor_pool.cc:272] collecting diagnostics on the listening RPC socket 127.31.42.195:35639 every 8 connection(s)
I20250624 02:16:37.540217 6137 server_base.cc:1180] Dumped server information to /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/data/info.pb
I20250624 02:16:37.549687 31915 external_mini_cluster.cc:1413] Started /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu as pid 6137
I20250624 02:16:37.583647 6269 heartbeater.cc:344] Connected to a master server at 127.31.42.254:34813
I20250624 02:16:37.584165 6269 heartbeater.cc:461] Registering TS with master...
I20250624 02:16:37.585446 6269 heartbeater.cc:507] Master 127.31.42.254:34813 requested a full tablet report, sending...
I20250624 02:16:37.589520 5810 ts_manager.cc:194] Registered new tserver with Master: 1b47d4950d0647b9ab21d6f7cd081453 (127.31.42.195:35639)
I20250624 02:16:37.592559 5810 master_service.cc:496] Signed X509 certificate for tserver {username='slave'} at 127.31.42.195:56865
I20250624 02:16:37.601603 31915 external_mini_cluster.cc:934] 3 TS(s) registered with all masters
I20250624 02:16:37.643337 6162 log.cc:826] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453: Log is configured to *not* fsync() on all Append() calls
I20250624 02:16:37.837913 6281 raft_consensus.cc:491] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [term 1 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250624 02:16:37.838503 6281 raft_consensus.cc:513] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } }
I20250624 02:16:37.841504 6281 leader_election.cc:290] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 1b47d4950d0647b9ab21d6f7cd081453 (127.31.42.195:35639), 2b0d9569e0d84ee097265b983b992ff9 (127.31.42.194:40181)
I20250624 02:16:37.872128 6224 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "0cd08e8911a147fbb51dbb8d39ce6261" candidate_uuid: "0971d3ff884d42dc87eb43e864ed86ca" candidate_term: 2 candidate_status { last_received { term: 1 index: 205 } } ignore_live_leader: false dest_uuid: "1b47d4950d0647b9ab21d6f7cd081453" is_pre_election: true
W20250624 02:16:37.885812 5886 leader_election.cc:343] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [CANDIDATE]: Term 2 pre-election: Tablet error from VoteRequest() call to peer 1b47d4950d0647b9ab21d6f7cd081453 (127.31.42.195:35639): Illegal state: must be running to vote when last-logged opid is not known
I20250624 02:16:37.884167 6088 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "0cd08e8911a147fbb51dbb8d39ce6261" candidate_uuid: "0971d3ff884d42dc87eb43e864ed86ca" candidate_term: 2 candidate_status { last_received { term: 1 index: 205 } } ignore_live_leader: false dest_uuid: "2b0d9569e0d84ee097265b983b992ff9" is_pre_election: true
W20250624 02:16:37.894178 5886 leader_election.cc:343] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [CANDIDATE]: Term 2 pre-election: Tablet error from VoteRequest() call to peer 2b0d9569e0d84ee097265b983b992ff9 (127.31.42.194:40181): Illegal state: must be running to vote when last-logged opid is not known
I20250624 02:16:37.894531 5886 leader_election.cc:304] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate lost. Election summary: received 3 responses out of 3 voters: 1 yes votes; 2 no votes. yes voters: 0971d3ff884d42dc87eb43e864ed86ca; no voters: 1b47d4950d0647b9ab21d6f7cd081453, 2b0d9569e0d84ee097265b983b992ff9
I20250624 02:16:37.895222 6281 raft_consensus.cc:2747] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [term 1 FOLLOWER]: Leader pre-election lost for term 2. Reason: could not achieve majority
I20250624 02:16:38.265933 6026 tablet_bootstrap.cc:492] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9: Bootstrap replayed 1/1 log segments. Stats: ops{read=205 overwritten=0 applied=205 ignored=0} inserts{seen=10200 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 02:16:38.266767 6026 tablet_bootstrap.cc:492] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9: Bootstrap complete.
I20250624 02:16:38.268133 6026 ts_tablet_manager.cc:1397] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9: Time spent bootstrapping tablet: real 3.674s user 3.438s sys 0.076s
I20250624 02:16:38.274397 6026 raft_consensus.cc:357] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } }
I20250624 02:16:38.276991 6026 raft_consensus.cc:738] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 2b0d9569e0d84ee097265b983b992ff9, State: Initialized, Role: FOLLOWER
I20250624 02:16:38.277855 6026 consensus_queue.cc:260] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 205, Last appended: 1.205, Last appended by leader: 205, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } }
I20250624 02:16:38.282167 6026 ts_tablet_manager.cc:1428] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9: Time spent starting tablet: real 0.014s user 0.011s sys 0.004s
I20250624 02:16:38.596499 6269 heartbeater.cc:499] Master 127.31.42.254:34813 was elected leader, sending a full tablet report...
I20250624 02:16:39.466548 6288 raft_consensus.cc:491] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [term 1 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250624 02:16:39.467018 6288 raft_consensus.cc:513] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [term 1 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } }
I20250624 02:16:39.475549 6288 leader_election.cc:290] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [CANDIDATE]: Term 2 pre-election: Requested pre-vote from peers 1b47d4950d0647b9ab21d6f7cd081453 (127.31.42.195:35639), 0971d3ff884d42dc87eb43e864ed86ca (127.31.42.193:39353)
I20250624 02:16:39.494436 6224 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "0cd08e8911a147fbb51dbb8d39ce6261" candidate_uuid: "2b0d9569e0d84ee097265b983b992ff9" candidate_term: 2 candidate_status { last_received { term: 1 index: 205 } } ignore_live_leader: false dest_uuid: "1b47d4950d0647b9ab21d6f7cd081453" is_pre_election: true
W20250624 02:16:39.496024 6022 leader_election.cc:343] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [CANDIDATE]: Term 2 pre-election: Tablet error from VoteRequest() call to peer 1b47d4950d0647b9ab21d6f7cd081453 (127.31.42.195:35639): Illegal state: must be running to vote when last-logged opid is not known
I20250624 02:16:39.498040 5952 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "0cd08e8911a147fbb51dbb8d39ce6261" candidate_uuid: "2b0d9569e0d84ee097265b983b992ff9" candidate_term: 2 candidate_status { last_received { term: 1 index: 205 } } ignore_live_leader: false dest_uuid: "0971d3ff884d42dc87eb43e864ed86ca" is_pre_election: true
I20250624 02:16:39.498682 5952 raft_consensus.cc:2466] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [term 1 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 2b0d9569e0d84ee097265b983b992ff9 in term 1.
I20250624 02:16:39.500022 6023 leader_election.cc:304] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [CANDIDATE]: Term 2 pre-election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 0971d3ff884d42dc87eb43e864ed86ca, 2b0d9569e0d84ee097265b983b992ff9; no voters: 1b47d4950d0647b9ab21d6f7cd081453
I20250624 02:16:39.500895 6288 raft_consensus.cc:2802] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [term 1 FOLLOWER]: Leader pre-election won for term 2
I20250624 02:16:39.501185 6288 raft_consensus.cc:491] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [term 1 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20250624 02:16:39.501433 6288 raft_consensus.cc:3058] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [term 1 FOLLOWER]: Advancing to term 2
I20250624 02:16:39.507784 6288 raft_consensus.cc:513] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [term 2 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } }
I20250624 02:16:39.509263 6288 leader_election.cc:290] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [CANDIDATE]: Term 2 election: Requested vote from peers 1b47d4950d0647b9ab21d6f7cd081453 (127.31.42.195:35639), 0971d3ff884d42dc87eb43e864ed86ca (127.31.42.193:39353)
I20250624 02:16:39.510042 6224 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "0cd08e8911a147fbb51dbb8d39ce6261" candidate_uuid: "2b0d9569e0d84ee097265b983b992ff9" candidate_term: 2 candidate_status { last_received { term: 1 index: 205 } } ignore_live_leader: false dest_uuid: "1b47d4950d0647b9ab21d6f7cd081453"
I20250624 02:16:39.510334 5952 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "0cd08e8911a147fbb51dbb8d39ce6261" candidate_uuid: "2b0d9569e0d84ee097265b983b992ff9" candidate_term: 2 candidate_status { last_received { term: 1 index: 205 } } ignore_live_leader: false dest_uuid: "0971d3ff884d42dc87eb43e864ed86ca"
I20250624 02:16:39.510826 5952 raft_consensus.cc:3058] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [term 1 FOLLOWER]: Advancing to term 2
W20250624 02:16:39.511144 6022 leader_election.cc:343] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [CANDIDATE]: Term 2 election: Tablet error from VoteRequest() call to peer 1b47d4950d0647b9ab21d6f7cd081453 (127.31.42.195:35639): Illegal state: must be running to vote when last-logged opid is not known
I20250624 02:16:39.517103 5952 raft_consensus.cc:2466] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [term 2 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 2b0d9569e0d84ee097265b983b992ff9 in term 2.
I20250624 02:16:39.518411 6023 leader_election.cc:304] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [CANDIDATE]: Term 2 election: Election decided. Result: candidate won. Election summary: received 3 responses out of 3 voters: 2 yes votes; 1 no votes. yes voters: 0971d3ff884d42dc87eb43e864ed86ca, 2b0d9569e0d84ee097265b983b992ff9; no voters: 1b47d4950d0647b9ab21d6f7cd081453
I20250624 02:16:39.519292 6288 raft_consensus.cc:2802] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [term 2 FOLLOWER]: Leader election won for term 2
I20250624 02:16:39.521226 6288 raft_consensus.cc:695] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [term 2 LEADER]: Becoming Leader. State: Replica: 2b0d9569e0d84ee097265b983b992ff9, State: Running, Role: LEADER
I20250624 02:16:39.522115 6288 consensus_queue.cc:237] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 205, Committed index: 205, Last appended: 1.205, Last appended by leader: 205, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } }
I20250624 02:16:39.535665 5810 catalog_manager.cc:5582] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 reported cstate change: term changed from 0 to 2, leader changed from <none> to 2b0d9569e0d84ee097265b983b992ff9 (127.31.42.194), VOTER 0971d3ff884d42dc87eb43e864ed86ca (127.31.42.193) added, VOTER 1b47d4950d0647b9ab21d6f7cd081453 (127.31.42.195) added, VOTER 2b0d9569e0d84ee097265b983b992ff9 (127.31.42.194) added. New cstate: current_term: 2 leader_uuid: "2b0d9569e0d84ee097265b983b992ff9" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } health_report { overall_health: HEALTHY } } }
I20250624 02:16:39.626358 6162 tablet_bootstrap.cc:492] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453: Bootstrap replayed 1/1 log segments. Stats: ops{read=205 overwritten=0 applied=205 ignored=0} inserts{seen=10200 ignored=0} mutations{seen=0 ignored=0} orphaned_commits=0. Pending: 0 replicates
I20250624 02:16:39.627377 6162 tablet_bootstrap.cc:492] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453: Bootstrap complete.
I20250624 02:16:39.629119 6162 ts_tablet_manager.cc:1397] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453: Time spent bootstrapping tablet: real 2.300s user 2.198s sys 0.072s
I20250624 02:16:39.636345 6162 raft_consensus.cc:357] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453 [term 1 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } }
I20250624 02:16:39.638309 6162 raft_consensus.cc:738] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453 [term 1 FOLLOWER]: Becoming Follower/Learner. State: Replica: 1b47d4950d0647b9ab21d6f7cd081453, State: Initialized, Role: FOLLOWER
I20250624 02:16:39.638921 6162 consensus_queue.cc:260] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 205, Last appended: 1.205, Last appended by leader: 205, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } }
I20250624 02:16:39.641667 6162 ts_tablet_manager.cc:1428] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453: Time spent starting tablet: real 0.012s user 0.014s sys 0.000s
W20250624 02:16:39.997119 31915 scanner-internal.cc:458] Time spent opening tablet: real 2.337s user 0.007s sys 0.002s
I20250624 02:16:40.026371 5952 raft_consensus.cc:1273] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [term 2 FOLLOWER]: Refusing update from remote peer 2b0d9569e0d84ee097265b983b992ff9: Log matching property violated. Preceding OpId in replica: term: 1 index: 205. Preceding OpId from leader: term: 2 index: 206. (index mismatch)
I20250624 02:16:40.028065 6288 consensus_queue.cc:1035] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [LEADER]: Connected to new peer: Peer: permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 206, Last known committed idx: 205, Time since last communication: 0.000s
I20250624 02:16:40.057588 6224 raft_consensus.cc:3058] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453 [term 1 FOLLOWER]: Advancing to term 2
I20250624 02:16:40.069389 6224 raft_consensus.cc:1273] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453 [term 2 FOLLOWER]: Refusing update from remote peer 2b0d9569e0d84ee097265b983b992ff9: Log matching property violated. Preceding OpId in replica: term: 1 index: 205. Preceding OpId from leader: term: 2 index: 206. (index mismatch)
I20250624 02:16:40.071985 6306 consensus_queue.cc:1035] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [LEADER]: Connected to new peer: Peer: permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 206, Last known committed idx: 205, Time since last communication: 0.001s
W20250624 02:16:40.082810 5993 debug-util.cc:398] Leaking SignalData structure 0x7b08000aea20 after lost signal to thread 5868
I20250624 02:16:40.142388 6088 consensus_queue.cc:237] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 206, Committed index: 206, Last appended: 2.206, Last appended by leader: 205, Current term: 2, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: 207 OBSOLETE_local: false peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } }
I20250624 02:16:40.148470 5952 raft_consensus.cc:1273] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [term 2 FOLLOWER]: Refusing update from remote peer 2b0d9569e0d84ee097265b983b992ff9: Log matching property violated. Preceding OpId in replica: term: 2 index: 206. Preceding OpId from leader: term: 2 index: 207. (index mismatch)
I20250624 02:16:40.150110 6306 consensus_queue.cc:1035] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [LEADER]: Connected to new peer: Peer: permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 207, Last known committed idx: 206, Time since last communication: 0.000s
I20250624 02:16:40.157560 6306 raft_consensus.cc:2953] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [term 2 LEADER]: Committing config change with OpId 2.207: config changed from index -1 to 207, VOTER 1b47d4950d0647b9ab21d6f7cd081453 (127.31.42.195) evicted. New config: { opid_index: 207 OBSOLETE_local: false peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } } }
I20250624 02:16:40.160804 5952 raft_consensus.cc:2953] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [term 2 FOLLOWER]: Committing config change with OpId 2.207: config changed from index -1 to 207, VOTER 1b47d4950d0647b9ab21d6f7cd081453 (127.31.42.195) evicted. New config: { opid_index: 207 OBSOLETE_local: false peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } } }
I20250624 02:16:40.182140 5796 catalog_manager.cc:5095] ChangeConfig:REMOVE_PEER RPC for tablet 0cd08e8911a147fbb51dbb8d39ce6261 with cas_config_opid_index -1: ChangeConfig:REMOVE_PEER succeeded (attempt 1)
I20250624 02:16:40.187448 5810 catalog_manager.cc:5582] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 reported cstate change: config changed from index -1 to 207, VOTER 1b47d4950d0647b9ab21d6f7cd081453 (127.31.42.195) evicted. New cstate: current_term: 2 leader_uuid: "2b0d9569e0d84ee097265b983b992ff9" committed_config { opid_index: 207 OBSOLETE_local: false peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } health_report { overall_health: HEALTHY } } }
I20250624 02:16:40.221035 6088 consensus_queue.cc:237] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 207, Committed index: 207, Last appended: 2.207, Last appended by leader: 205, Current term: 2, Majority size: 1, State: 0, Mode: LEADER, active raft config: opid_index: 208 OBSOLETE_local: false peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } }
I20250624 02:16:40.225495 6306 raft_consensus.cc:2953] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [term 2 LEADER]: Committing config change with OpId 2.208: config changed from index 207 to 208, VOTER 0971d3ff884d42dc87eb43e864ed86ca (127.31.42.193) evicted. New config: { opid_index: 208 OBSOLETE_local: false peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } } }
I20250624 02:16:40.234556 5796 catalog_manager.cc:5095] ChangeConfig:REMOVE_PEER RPC for tablet 0cd08e8911a147fbb51dbb8d39ce6261 with cas_config_opid_index 207: ChangeConfig:REMOVE_PEER succeeded (attempt 1)
I20250624 02:16:40.239324 5809 catalog_manager.cc:5582] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 reported cstate change: config changed from index 207 to 208, VOTER 0971d3ff884d42dc87eb43e864ed86ca (127.31.42.193) evicted. New cstate: current_term: 2 leader_uuid: "2b0d9569e0d84ee097265b983b992ff9" committed_config { opid_index: 208 OBSOLETE_local: false peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } health_report { overall_health: HEALTHY } } }
I20250624 02:16:40.239090 6204 tablet_service.cc:1515] Processing DeleteTablet for tablet 0cd08e8911a147fbb51dbb8d39ce6261 with delete_type TABLET_DATA_TOMBSTONED (TS 1b47d4950d0647b9ab21d6f7cd081453 not found in new config with opid_index 207) from {username='slave'} at 127.0.0.1:46624
I20250624 02:16:40.253129 6316 tablet_replica.cc:331] stopping tablet replica
I20250624 02:16:40.254143 6316 raft_consensus.cc:2241] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453 [term 2 FOLLOWER]: Raft consensus shutting down.
I20250624 02:16:40.254943 6316 raft_consensus.cc:2270] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453 [term 2 FOLLOWER]: Raft consensus is shut down!
I20250624 02:16:40.273447 5932 tablet_service.cc:1515] Processing DeleteTablet for tablet 0cd08e8911a147fbb51dbb8d39ce6261 with delete_type TABLET_DATA_TOMBSTONED (TS 0971d3ff884d42dc87eb43e864ed86ca not found in new config with opid_index 208) from {username='slave'} at 127.0.0.1:55392
I20250624 02:16:40.283227 6318 tablet_replica.cc:331] stopping tablet replica
I20250624 02:16:40.294626 6318 raft_consensus.cc:2241] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [term 2 FOLLOWER]: Raft consensus shutting down.
I20250624 02:16:40.295176 6316 ts_tablet_manager.cc:1905] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453: Deleting tablet data with delete state TABLET_DATA_TOMBSTONED
I20250624 02:16:40.295436 6318 raft_consensus.cc:2270] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca [term 2 FOLLOWER]: Raft consensus is shut down!
I20250624 02:16:40.316949 6316 ts_tablet_manager.cc:1918] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453: tablet deleted with delete type TABLET_DATA_TOMBSTONED: last-logged OpId 2.206
I20250624 02:16:40.317518 6316 log.cc:1199] T 0cd08e8911a147fbb51dbb8d39ce6261 P 1b47d4950d0647b9ab21d6f7cd081453: Deleting WAL directory at /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/wal/wals/0cd08e8911a147fbb51dbb8d39ce6261
I20250624 02:16:40.319408 5796 catalog_manager.cc:4928] TS 1b47d4950d0647b9ab21d6f7cd081453 (127.31.42.195:35639): tablet 0cd08e8911a147fbb51dbb8d39ce6261 (table pre_rebuild [id=3e796535e3dc4e5294e87050d8c55a7c]) successfully deleted
I20250624 02:16:40.339175 6318 ts_tablet_manager.cc:1905] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca: Deleting tablet data with delete state TABLET_DATA_TOMBSTONED
I20250624 02:16:40.355191 6318 ts_tablet_manager.cc:1918] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca: tablet deleted with delete type TABLET_DATA_TOMBSTONED: last-logged OpId 2.207
I20250624 02:16:40.355762 6318 log.cc:1199] T 0cd08e8911a147fbb51dbb8d39ce6261 P 0971d3ff884d42dc87eb43e864ed86ca: Deleting WAL directory at /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/wal/wals/0cd08e8911a147fbb51dbb8d39ce6261
I20250624 02:16:40.357774 5797 catalog_manager.cc:4928] TS 0971d3ff884d42dc87eb43e864ed86ca (127.31.42.193:39353): tablet 0cd08e8911a147fbb51dbb8d39ce6261 (table pre_rebuild [id=3e796535e3dc4e5294e87050d8c55a7c]) successfully deleted
I20250624 02:16:41.098937 5932 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250624 02:16:41.109721 6204 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250624 02:16:41.127333 6068 tablet_service.cc:1430] Tablet server has 1 leaders and 0 scanners
Master Summary
UUID | Address | Status
----------------------------------+---------------------+---------
7d986e92960341c6a16aa83dde4637c6 | 127.31.42.254:34813 | HEALTHY
Unusual flags for Master:
Flag | Value | Tags | Master
----------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+-----------------+-------------------------
ipki_ca_key_size | 768 | experimental | all 1 server(s) checked
ipki_server_key_size | 768 | experimental | all 1 server(s) checked
never_fsync | true | unsafe,advanced | all 1 server(s) checked
openssl_security_level_override | 0 | unsafe,hidden | all 1 server(s) checked
rpc_reuseport | true | experimental | all 1 server(s) checked
rpc_server_allow_ephemeral_ports | true | unsafe | all 1 server(s) checked
server_dump_info_format | pb | hidden | all 1 server(s) checked
server_dump_info_path | /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/data/info.pb | hidden | all 1 server(s) checked
tsk_num_rsa_bits | 512 | experimental | all 1 server(s) checked
Flags of checked categories for Master:
Flag | Value | Master
---------------------+---------------------+-------------------------
builtin_ntp_servers | 127.31.42.212:39925 | all 1 server(s) checked
time_source | builtin | all 1 server(s) checked
Tablet Server Summary
UUID | Address | Status | Location | Tablet Leaders | Active Scanners
----------------------------------+---------------------+---------+----------+----------------+-----------------
0971d3ff884d42dc87eb43e864ed86ca | 127.31.42.193:39353 | HEALTHY | <none> | 0 | 0
1b47d4950d0647b9ab21d6f7cd081453 | 127.31.42.195:35639 | HEALTHY | <none> | 0 | 0
2b0d9569e0d84ee097265b983b992ff9 | 127.31.42.194:40181 | HEALTHY | <none> | 1 | 0
Tablet Server Location Summary
Location | Count
----------+---------
<none> | 3
Unusual flags for Tablet Server:
Flag | Value | Tags | Tablet Server
----------------------------------+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+-----------------+-------------------------
ipki_server_key_size | 768 | experimental | all 3 server(s) checked
local_ip_for_outbound_sockets | 127.31.42.193 | experimental | 127.31.42.193:39353
local_ip_for_outbound_sockets | 127.31.42.194 | experimental | 127.31.42.194:40181
local_ip_for_outbound_sockets | 127.31.42.195 | experimental | 127.31.42.195:35639
never_fsync | true | unsafe,advanced | all 3 server(s) checked
openssl_security_level_override | 0 | unsafe,hidden | all 3 server(s) checked
rpc_server_allow_ephemeral_ports | true | unsafe | all 3 server(s) checked
server_dump_info_format | pb | hidden | all 3 server(s) checked
server_dump_info_path | /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/data/info.pb | hidden | 127.31.42.193:39353
server_dump_info_path | /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/data/info.pb | hidden | 127.31.42.194:40181
server_dump_info_path | /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/data/info.pb | hidden | 127.31.42.195:35639
Flags of checked categories for Tablet Server:
Flag | Value | Tablet Server
---------------------+---------------------+-------------------------
builtin_ntp_servers | 127.31.42.212:39925 | all 3 server(s) checked
time_source | builtin | all 3 server(s) checked
Version Summary
Version | Servers
-----------------+-------------------------
1.18.0-SNAPSHOT | all 4 server(s) checked
Tablet Summary
The cluster doesn't have any matching system tables
Summary by table
Name | RF | Status | Total Tablets | Healthy | Recovering | Under-replicated | Unavailable
-------------+----+---------+---------------+---------+------------+------------------+-------------
pre_rebuild | 1 | HEALTHY | 1 | 1 | 0 | 0 | 0
Tablet Replica Count Summary
Statistic | Replica Count
----------------+---------------
Minimum | 0
First Quartile | 0
Median | 0
Third Quartile | 1
Maximum | 1
Total Count Summary
| Total Count
----------------+-------------
Masters | 1
Tablet Servers | 3
Tables | 1
Tablets | 1
Replicas | 1
==================
Warnings:
==================
Some masters have unsafe, experimental, or hidden flags set
Some tablet servers have unsafe, experimental, or hidden flags set
OK
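(The report above, like the second one later in this log, is the captured output of Kudu's cluster health check, `kudu cluster ksck`, which the test runs against its external mini cluster. As an illustration only, here is a minimal sketch of invoking the same check out-of-band with Python's standard library, reusing the `kudu` binary path and master address that appear elsewhere in this log; the helper name and timeout are assumptions, not part of the test itself.)

import subprocess

# Values taken from this log; adjust for your own cluster. This is a sketch,
# assuming the binary and master are still reachable when it is run.
KUDU_BINARY = "/tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu"
MASTER_ADDR = "127.31.42.254:34813"

def run_ksck(binary: str, master_addr: str, timeout_s: int = 60) -> str:
    """Run `kudu cluster ksck` against the given master and return its stdout.

    check=True makes subprocess raise CalledProcessError if the tool exits
    with a non-zero status, which a test harness could use as a health assert.
    """
    result = subprocess.run(
        [binary, "cluster", "ksck", master_addr],
        capture_output=True,
        text=True,
        timeout=timeout_s,
        check=True,
    )
    return result.stdout

if __name__ == "__main__":
    print(run_ksck(KUDU_BINARY, MASTER_ADDR))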
I20250624 02:16:41.523262 31915 log_verifier.cc:126] Checking tablet 0cd08e8911a147fbb51dbb8d39ce6261
I20250624 02:16:41.851610 31915 log_verifier.cc:177] Verified matching terms for 208 ops in tablet 0cd08e8911a147fbb51dbb8d39ce6261
I20250624 02:16:41.853989 5809 catalog_manager.cc:2482] Servicing SoftDeleteTable request from {username='slave'} at 127.0.0.1:38060:
table { table_name: "pre_rebuild" } modify_external_catalogs: true
I20250624 02:16:41.854494 5809 catalog_manager.cc:2730] Servicing DeleteTable request from {username='slave'} at 127.0.0.1:38060:
table { table_name: "pre_rebuild" } modify_external_catalogs: true
I20250624 02:16:41.866612 5809 catalog_manager.cc:5869] T 00000000000000000000000000000000 P 7d986e92960341c6a16aa83dde4637c6: Sending DeleteTablet for 1 replicas of tablet 0cd08e8911a147fbb51dbb8d39ce6261
I20250624 02:16:41.868587 31915 test_util.cc:276] Using random seed: -347196146
I20250624 02:16:41.868705 6068 tablet_service.cc:1515] Processing DeleteTablet for tablet 0cd08e8911a147fbb51dbb8d39ce6261 with delete_type TABLET_DATA_DELETED (Table deleted at 2025-06-24 02:16:41 UTC) from {username='slave'} at 127.0.0.1:37008
I20250624 02:16:41.879375 6350 tablet_replica.cc:331] stopping tablet replica
I20250624 02:16:41.880784 6350 raft_consensus.cc:2241] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [term 2 LEADER]: Raft consensus shutting down.
I20250624 02:16:41.881750 6350 raft_consensus.cc:2270] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9 [term 2 FOLLOWER]: Raft consensus is shut down!
I20250624 02:16:41.930410 6350 ts_tablet_manager.cc:1905] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9: Deleting tablet data with delete state TABLET_DATA_DELETED
I20250624 02:16:41.948588 6350 ts_tablet_manager.cc:1918] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9: tablet deleted with delete type TABLET_DATA_DELETED: last-logged OpId 2.208
I20250624 02:16:41.949123 6350 log.cc:1199] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9: Deleting WAL directory at /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/wal/wals/0cd08e8911a147fbb51dbb8d39ce6261
I20250624 02:16:41.950471 6350 ts_tablet_manager.cc:1939] T 0cd08e8911a147fbb51dbb8d39ce6261 P 2b0d9569e0d84ee097265b983b992ff9: Deleting consensus metadata
I20250624 02:16:41.954705 5796 catalog_manager.cc:4928] TS 2b0d9569e0d84ee097265b983b992ff9 (127.31.42.194:40181): tablet 0cd08e8911a147fbb51dbb8d39ce6261 (table pre_rebuild [id=3e796535e3dc4e5294e87050d8c55a7c]) successfully deleted
I20250624 02:16:41.959237 5809 catalog_manager.cc:2232] Servicing CreateTable request from {username='slave'} at 127.0.0.1:38092:
name: "post_rebuild"
schema {
  columns {
    name: "key"
    type: INT32
    is_key: true
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "int_val"
    type: INT32
    is_key: false
    is_nullable: false
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
  columns {
    name: "string_val"
    type: STRING
    is_key: false
    is_nullable: true
    encoding: AUTO_ENCODING
    compression: DEFAULT_COMPRESSION
    cfile_block_size: 0
    immutable: false
  }
}
num_replicas: 3
split_rows_range_bounds {
}
partition_schema {
  range_schema {
    columns {
      name: "key"
    }
  }
}
W20250624 02:16:41.961797 5809 catalog_manager.cc:6944] The number of live tablet servers is not enough to re-replicate a tablet replica of the newly created table post_rebuild in case of a server failure: 4 tablet servers would be needed, 3 are available. Consider bringing up more tablet servers.
I20250624 02:16:41.987362 5932 tablet_service.cc:1468] Processing CreateTablet for tablet a4e0b888dd06491c9abbd5c20bddd98a (DEFAULT_TABLE table=post_rebuild [id=d474878276674bfaa08bbb10d9377d47]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 02:16:41.989032 5932 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet a4e0b888dd06491c9abbd5c20bddd98a. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:16:41.989084 6204 tablet_service.cc:1468] Processing CreateTablet for tablet a4e0b888dd06491c9abbd5c20bddd98a (DEFAULT_TABLE table=post_rebuild [id=d474878276674bfaa08bbb10d9377d47]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 02:16:41.990437 6204 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet a4e0b888dd06491c9abbd5c20bddd98a. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:16:42.000161 6068 tablet_service.cc:1468] Processing CreateTablet for tablet a4e0b888dd06491c9abbd5c20bddd98a (DEFAULT_TABLE table=post_rebuild [id=d474878276674bfaa08bbb10d9377d47]), partition=RANGE (key) PARTITION UNBOUNDED
I20250624 02:16:42.001631 6068 data_dirs.cc:400] Could only allocate 1 dirs of requested 3 for tablet a4e0b888dd06491c9abbd5c20bddd98a. 1 dirs total, 0 dirs full, 0 dirs failed
I20250624 02:16:42.029304 6357 tablet_bootstrap.cc:492] T a4e0b888dd06491c9abbd5c20bddd98a P 1b47d4950d0647b9ab21d6f7cd081453: Bootstrap starting.
I20250624 02:16:42.033401 6358 tablet_bootstrap.cc:492] T a4e0b888dd06491c9abbd5c20bddd98a P 0971d3ff884d42dc87eb43e864ed86ca: Bootstrap starting.
I20250624 02:16:42.039224 6359 tablet_bootstrap.cc:492] T a4e0b888dd06491c9abbd5c20bddd98a P 2b0d9569e0d84ee097265b983b992ff9: Bootstrap starting.
I20250624 02:16:42.040292 6358 tablet_bootstrap.cc:654] T a4e0b888dd06491c9abbd5c20bddd98a P 0971d3ff884d42dc87eb43e864ed86ca: Neither blocks nor log segments found. Creating new log.
I20250624 02:16:42.041402 6357 tablet_bootstrap.cc:654] T a4e0b888dd06491c9abbd5c20bddd98a P 1b47d4950d0647b9ab21d6f7cd081453: Neither blocks nor log segments found. Creating new log.
I20250624 02:16:42.045522 6359 tablet_bootstrap.cc:654] T a4e0b888dd06491c9abbd5c20bddd98a P 2b0d9569e0d84ee097265b983b992ff9: Neither blocks nor log segments found. Creating new log.
I20250624 02:16:42.046797 6357 tablet_bootstrap.cc:492] T a4e0b888dd06491c9abbd5c20bddd98a P 1b47d4950d0647b9ab21d6f7cd081453: No bootstrap required, opened a new log
I20250624 02:16:42.047364 6357 ts_tablet_manager.cc:1397] T a4e0b888dd06491c9abbd5c20bddd98a P 1b47d4950d0647b9ab21d6f7cd081453: Time spent bootstrapping tablet: real 0.018s user 0.007s sys 0.005s
I20250624 02:16:42.050416 6357 raft_consensus.cc:357] T a4e0b888dd06491c9abbd5c20bddd98a P 1b47d4950d0647b9ab21d6f7cd081453 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } }
I20250624 02:16:42.051403 6357 raft_consensus.cc:383] T a4e0b888dd06491c9abbd5c20bddd98a P 1b47d4950d0647b9ab21d6f7cd081453 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:16:42.051829 6357 raft_consensus.cc:738] T a4e0b888dd06491c9abbd5c20bddd98a P 1b47d4950d0647b9ab21d6f7cd081453 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 1b47d4950d0647b9ab21d6f7cd081453, State: Initialized, Role: FOLLOWER
I20250624 02:16:42.052896 6357 consensus_queue.cc:260] T a4e0b888dd06491c9abbd5c20bddd98a P 1b47d4950d0647b9ab21d6f7cd081453 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } }
I20250624 02:16:42.064689 6358 tablet_bootstrap.cc:492] T a4e0b888dd06491c9abbd5c20bddd98a P 0971d3ff884d42dc87eb43e864ed86ca: No bootstrap required, opened a new log
I20250624 02:16:42.066440 6359 tablet_bootstrap.cc:492] T a4e0b888dd06491c9abbd5c20bddd98a P 2b0d9569e0d84ee097265b983b992ff9: No bootstrap required, opened a new log
I20250624 02:16:42.067653 6359 ts_tablet_manager.cc:1397] T a4e0b888dd06491c9abbd5c20bddd98a P 2b0d9569e0d84ee097265b983b992ff9: Time spent bootstrapping tablet: real 0.029s user 0.008s sys 0.004s
I20250624 02:16:42.068181 6358 ts_tablet_manager.cc:1397] T a4e0b888dd06491c9abbd5c20bddd98a P 0971d3ff884d42dc87eb43e864ed86ca: Time spent bootstrapping tablet: real 0.035s user 0.010s sys 0.011s
I20250624 02:16:42.068508 6357 ts_tablet_manager.cc:1428] T a4e0b888dd06491c9abbd5c20bddd98a P 1b47d4950d0647b9ab21d6f7cd081453: Time spent starting tablet: real 0.021s user 0.008s sys 0.011s
I20250624 02:16:42.072528 6358 raft_consensus.cc:357] T a4e0b888dd06491c9abbd5c20bddd98a P 0971d3ff884d42dc87eb43e864ed86ca [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } }
I20250624 02:16:42.073559 6358 raft_consensus.cc:383] T a4e0b888dd06491c9abbd5c20bddd98a P 0971d3ff884d42dc87eb43e864ed86ca [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:16:42.073894 6358 raft_consensus.cc:738] T a4e0b888dd06491c9abbd5c20bddd98a P 0971d3ff884d42dc87eb43e864ed86ca [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 0971d3ff884d42dc87eb43e864ed86ca, State: Initialized, Role: FOLLOWER
I20250624 02:16:42.075107 6358 consensus_queue.cc:260] T a4e0b888dd06491c9abbd5c20bddd98a P 0971d3ff884d42dc87eb43e864ed86ca [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } }
I20250624 02:16:42.083062 6359 raft_consensus.cc:357] T a4e0b888dd06491c9abbd5c20bddd98a P 2b0d9569e0d84ee097265b983b992ff9 [term 0 FOLLOWER]: Replica starting. Triggering 0 pending ops. Active config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } }
I20250624 02:16:42.083885 6359 raft_consensus.cc:383] T a4e0b888dd06491c9abbd5c20bddd98a P 2b0d9569e0d84ee097265b983b992ff9 [term 0 FOLLOWER]: Consensus starting up: Expiring failure detector timer to make a prompt election more likely
I20250624 02:16:42.084250 6359 raft_consensus.cc:738] T a4e0b888dd06491c9abbd5c20bddd98a P 2b0d9569e0d84ee097265b983b992ff9 [term 0 FOLLOWER]: Becoming Follower/Learner. State: Replica: 2b0d9569e0d84ee097265b983b992ff9, State: Initialized, Role: FOLLOWER
I20250624 02:16:42.084990 6359 consensus_queue.cc:260] T a4e0b888dd06491c9abbd5c20bddd98a P 2b0d9569e0d84ee097265b983b992ff9 [NON_LEADER]: Queue going to NON_LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 0, Majority size: -1, State: 0, Mode: NON_LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } }
I20250624 02:16:42.088842 6358 ts_tablet_manager.cc:1428] T a4e0b888dd06491c9abbd5c20bddd98a P 0971d3ff884d42dc87eb43e864ed86ca: Time spent starting tablet: real 0.020s user 0.011s sys 0.007s
I20250624 02:16:42.092325 6359 ts_tablet_manager.cc:1428] T a4e0b888dd06491c9abbd5c20bddd98a P 2b0d9569e0d84ee097265b983b992ff9: Time spent starting tablet: real 0.024s user 0.002s sys 0.006s
W20250624 02:16:42.093380 5998 tablet.cc:2378] T a4e0b888dd06491c9abbd5c20bddd98a P 0971d3ff884d42dc87eb43e864ed86ca: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20250624 02:16:42.143289 6363 raft_consensus.cc:491] T a4e0b888dd06491c9abbd5c20bddd98a P 1b47d4950d0647b9ab21d6f7cd081453 [term 0 FOLLOWER]: Starting pre-election (no leader contacted us within the election timeout)
I20250624 02:16:42.143860 6363 raft_consensus.cc:513] T a4e0b888dd06491c9abbd5c20bddd98a P 1b47d4950d0647b9ab21d6f7cd081453 [term 0 FOLLOWER]: Starting pre-election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } }
I20250624 02:16:42.148278 6363 leader_election.cc:290] T a4e0b888dd06491c9abbd5c20bddd98a P 1b47d4950d0647b9ab21d6f7cd081453 [CANDIDATE]: Term 1 pre-election: Requested pre-vote from peers 0971d3ff884d42dc87eb43e864ed86ca (127.31.42.193:39353), 2b0d9569e0d84ee097265b983b992ff9 (127.31.42.194:40181)
I20250624 02:16:42.166939 6088 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "a4e0b888dd06491c9abbd5c20bddd98a" candidate_uuid: "1b47d4950d0647b9ab21d6f7cd081453" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "2b0d9569e0d84ee097265b983b992ff9" is_pre_election: true
I20250624 02:16:42.166882 5952 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "a4e0b888dd06491c9abbd5c20bddd98a" candidate_uuid: "1b47d4950d0647b9ab21d6f7cd081453" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "0971d3ff884d42dc87eb43e864ed86ca" is_pre_election: true
I20250624 02:16:42.167591 6088 raft_consensus.cc:2466] T a4e0b888dd06491c9abbd5c20bddd98a P 2b0d9569e0d84ee097265b983b992ff9 [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 1b47d4950d0647b9ab21d6f7cd081453 in term 0.
I20250624 02:16:42.167584 5952 raft_consensus.cc:2466] T a4e0b888dd06491c9abbd5c20bddd98a P 0971d3ff884d42dc87eb43e864ed86ca [term 0 FOLLOWER]: Leader pre-election vote request: Granting yes vote for candidate 1b47d4950d0647b9ab21d6f7cd081453 in term 0.
I20250624 02:16:42.169052 6159 leader_election.cc:304] T a4e0b888dd06491c9abbd5c20bddd98a P 1b47d4950d0647b9ab21d6f7cd081453 [CANDIDATE]: Term 1 pre-election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 0971d3ff884d42dc87eb43e864ed86ca, 1b47d4950d0647b9ab21d6f7cd081453; no voters:
I20250624 02:16:42.169981 6363 raft_consensus.cc:2802] T a4e0b888dd06491c9abbd5c20bddd98a P 1b47d4950d0647b9ab21d6f7cd081453 [term 0 FOLLOWER]: Leader pre-election won for term 1
I20250624 02:16:42.170331 6363 raft_consensus.cc:491] T a4e0b888dd06491c9abbd5c20bddd98a P 1b47d4950d0647b9ab21d6f7cd081453 [term 0 FOLLOWER]: Starting leader election (no leader contacted us within the election timeout)
I20250624 02:16:42.170624 6363 raft_consensus.cc:3058] T a4e0b888dd06491c9abbd5c20bddd98a P 1b47d4950d0647b9ab21d6f7cd081453 [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:16:42.177678 6363 raft_consensus.cc:513] T a4e0b888dd06491c9abbd5c20bddd98a P 1b47d4950d0647b9ab21d6f7cd081453 [term 1 FOLLOWER]: Starting leader election with config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } }
I20250624 02:16:42.180545 5952 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "a4e0b888dd06491c9abbd5c20bddd98a" candidate_uuid: "1b47d4950d0647b9ab21d6f7cd081453" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "0971d3ff884d42dc87eb43e864ed86ca"
I20250624 02:16:42.180620 6088 tablet_service.cc:1813] Received RequestConsensusVote() RPC: tablet_id: "a4e0b888dd06491c9abbd5c20bddd98a" candidate_uuid: "1b47d4950d0647b9ab21d6f7cd081453" candidate_term: 1 candidate_status { last_received { term: 0 index: 0 } } ignore_live_leader: false dest_uuid: "2b0d9569e0d84ee097265b983b992ff9"
I20250624 02:16:42.181145 6088 raft_consensus.cc:3058] T a4e0b888dd06491c9abbd5c20bddd98a P 2b0d9569e0d84ee097265b983b992ff9 [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:16:42.181166 5952 raft_consensus.cc:3058] T a4e0b888dd06491c9abbd5c20bddd98a P 0971d3ff884d42dc87eb43e864ed86ca [term 0 FOLLOWER]: Advancing to term 1
I20250624 02:16:42.187839 6088 raft_consensus.cc:2466] T a4e0b888dd06491c9abbd5c20bddd98a P 2b0d9569e0d84ee097265b983b992ff9 [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 1b47d4950d0647b9ab21d6f7cd081453 in term 1.
I20250624 02:16:42.189064 6158 leader_election.cc:304] T a4e0b888dd06491c9abbd5c20bddd98a P 1b47d4950d0647b9ab21d6f7cd081453 [CANDIDATE]: Term 1 election: Election decided. Result: candidate won. Election summary: received 2 responses out of 3 voters: 2 yes votes; 0 no votes. yes voters: 1b47d4950d0647b9ab21d6f7cd081453, 2b0d9569e0d84ee097265b983b992ff9; no voters:
I20250624 02:16:42.189175 5952 raft_consensus.cc:2466] T a4e0b888dd06491c9abbd5c20bddd98a P 0971d3ff884d42dc87eb43e864ed86ca [term 1 FOLLOWER]: Leader election vote request: Granting yes vote for candidate 1b47d4950d0647b9ab21d6f7cd081453 in term 1.
I20250624 02:16:42.191481 6363 leader_election.cc:290] T a4e0b888dd06491c9abbd5c20bddd98a P 1b47d4950d0647b9ab21d6f7cd081453 [CANDIDATE]: Term 1 election: Requested vote from peers 0971d3ff884d42dc87eb43e864ed86ca (127.31.42.193:39353), 2b0d9569e0d84ee097265b983b992ff9 (127.31.42.194:40181)
I20250624 02:16:42.191671 6370 raft_consensus.cc:2802] T a4e0b888dd06491c9abbd5c20bddd98a P 1b47d4950d0647b9ab21d6f7cd081453 [term 1 FOLLOWER]: Leader election won for term 1
I20250624 02:16:42.192302 6370 raft_consensus.cc:695] T a4e0b888dd06491c9abbd5c20bddd98a P 1b47d4950d0647b9ab21d6f7cd081453 [term 1 LEADER]: Becoming Leader. State: Replica: 1b47d4950d0647b9ab21d6f7cd081453, State: Running, Role: LEADER
I20250624 02:16:42.193374 6370 consensus_queue.cc:237] T a4e0b888dd06491c9abbd5c20bddd98a P 1b47d4950d0647b9ab21d6f7cd081453 [LEADER]: Queue going to LEADER mode. State: All replicated index: 0, Majority replicated index: 0, Committed index: 0, Last appended: 0.0, Last appended by leader: 0, Current term: 1, Majority size: 2, State: 0, Mode: LEADER, active raft config: opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } }
I20250624 02:16:42.203375 5809 catalog_manager.cc:5582] T a4e0b888dd06491c9abbd5c20bddd98a P 1b47d4950d0647b9ab21d6f7cd081453 reported cstate change: term changed from 0 to 1, leader changed from <none> to 1b47d4950d0647b9ab21d6f7cd081453 (127.31.42.195). New cstate: current_term: 1 leader_uuid: "1b47d4950d0647b9ab21d6f7cd081453" committed_config { opid_index: -1 OBSOLETE_local: false peers { permanent_uuid: "1b47d4950d0647b9ab21d6f7cd081453" member_type: VOTER last_known_addr { host: "127.31.42.195" port: 35639 } health_report { overall_health: HEALTHY } } peers { permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 } health_report { overall_health: UNKNOWN } } peers { permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 } health_report { overall_health: UNKNOWN } } }
W20250624 02:16:42.307869 6270 tablet.cc:2378] T a4e0b888dd06491c9abbd5c20bddd98a P 1b47d4950d0647b9ab21d6f7cd081453: Can't schedule compaction. Clean time has not been advanced past its initial value.
W20250624 02:16:42.325345 6134 tablet.cc:2378] T a4e0b888dd06491c9abbd5c20bddd98a P 2b0d9569e0d84ee097265b983b992ff9: Can't schedule compaction. Clean time has not been advanced past its initial value.
I20250624 02:16:42.417124 5952 raft_consensus.cc:1273] T a4e0b888dd06491c9abbd5c20bddd98a P 0971d3ff884d42dc87eb43e864ed86ca [term 1 FOLLOWER]: Refusing update from remote peer 1b47d4950d0647b9ab21d6f7cd081453: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20250624 02:16:42.417244 6088 raft_consensus.cc:1273] T a4e0b888dd06491c9abbd5c20bddd98a P 2b0d9569e0d84ee097265b983b992ff9 [term 1 FOLLOWER]: Refusing update from remote peer 1b47d4950d0647b9ab21d6f7cd081453: Log matching property violated. Preceding OpId in replica: term: 0 index: 0. Preceding OpId from leader: term: 1 index: 2. (index mismatch)
I20250624 02:16:42.418673 6363 consensus_queue.cc:1035] T a4e0b888dd06491c9abbd5c20bddd98a P 1b47d4950d0647b9ab21d6f7cd081453 [LEADER]: Connected to new peer: Peer: permanent_uuid: "2b0d9569e0d84ee097265b983b992ff9" member_type: VOTER last_known_addr { host: "127.31.42.194" port: 40181 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20250624 02:16:42.419420 6370 consensus_queue.cc:1035] T a4e0b888dd06491c9abbd5c20bddd98a P 1b47d4950d0647b9ab21d6f7cd081453 [LEADER]: Connected to new peer: Peer: permanent_uuid: "0971d3ff884d42dc87eb43e864ed86ca" member_type: VOTER last_known_addr { host: "127.31.42.193" port: 39353 }, Status: LMP_MISMATCH, Last received: 0.0, Next index: 1, Last known committed idx: 0, Time since last communication: 0.000s
I20250624 02:16:42.460371 6378 mvcc.cc:204] Tried to move back new op lower bound from 7170995824281178112 to 7170995823394701312. Current Snapshot: MvccSnapshot[applied={T|T < 7170995824281178112}]
I20250624 02:16:42.480813 6379 mvcc.cc:204] Tried to move back new op lower bound from 7170995824281178112 to 7170995823394701312. Current Snapshot: MvccSnapshot[applied={T|T < 7170995824281178112}]
W20250624 02:16:45.575147 6158 outbound_call.cc:321] RPC callback for RPC call kudu.consensus.ConsensusService.UpdateConsensus -> {remote=127.31.42.194:40181, user_credentials={real_user=slave}} blocked reactor thread for 81575.6us
I20250624 02:16:47.842959 5932 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
I20250624 02:16:47.843149 6204 tablet_service.cc:1430] Tablet server has 1 leaders and 0 scanners
I20250624 02:16:47.843907 6068 tablet_service.cc:1430] Tablet server has 0 leaders and 0 scanners
Master Summary
UUID | Address | Status
----------------------------------+---------------------+---------
7d986e92960341c6a16aa83dde4637c6 | 127.31.42.254:34813 | HEALTHY
Unusual flags for Master:
Flag | Value | Tags | Master
----------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+-----------------+-------------------------
ipki_ca_key_size | 768 | experimental | all 1 server(s) checked
ipki_server_key_size | 768 | experimental | all 1 server(s) checked
never_fsync | true | unsafe,advanced | all 1 server(s) checked
openssl_security_level_override | 0 | unsafe,hidden | all 1 server(s) checked
rpc_reuseport | true | experimental | all 1 server(s) checked
rpc_server_allow_ephemeral_ports | true | unsafe | all 1 server(s) checked
server_dump_info_format | pb | hidden | all 1 server(s) checked
server_dump_info_path | /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/master-0/data/info.pb | hidden | all 1 server(s) checked
tsk_num_rsa_bits | 512 | experimental | all 1 server(s) checked
Flags of checked categories for Master:
Flag | Value | Master
---------------------+---------------------+-------------------------
builtin_ntp_servers | 127.31.42.212:39925 | all 1 server(s) checked
time_source | builtin | all 1 server(s) checked
Tablet Server Summary
UUID | Address | Status | Location | Tablet Leaders | Active Scanners
----------------------------------+---------------------+---------+----------+----------------+-----------------
0971d3ff884d42dc87eb43e864ed86ca | 127.31.42.193:39353 | HEALTHY | <none> | 0 | 0
1b47d4950d0647b9ab21d6f7cd081453 | 127.31.42.195:35639 | HEALTHY | <none> | 1 | 0
2b0d9569e0d84ee097265b983b992ff9 | 127.31.42.194:40181 | HEALTHY | <none> | 0 | 0
Tablet Server Location Summary
Location | Count
----------+---------
<none> | 3
Unusual flags for Tablet Server:
Flag | Value | Tags | Tablet Server
----------------------------------+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+-----------------+-------------------------
ipki_server_key_size | 768 | experimental | all 3 server(s) checked
local_ip_for_outbound_sockets | 127.31.42.193 | experimental | 127.31.42.193:39353
local_ip_for_outbound_sockets | 127.31.42.194 | experimental | 127.31.42.194:40181
local_ip_for_outbound_sockets | 127.31.42.195 | experimental | 127.31.42.195:35639
never_fsync | true | unsafe,advanced | all 3 server(s) checked
openssl_security_level_override | 0 | unsafe,hidden | all 3 server(s) checked
rpc_server_allow_ephemeral_ports | true | unsafe | all 3 server(s) checked
server_dump_info_format | pb | hidden | all 3 server(s) checked
server_dump_info_path | /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-0/data/info.pb | hidden | 127.31.42.193:39353
server_dump_info_path | /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-1/data/info.pb | hidden | 127.31.42.194:40181
server_dump_info_path | /tmp/dist-test-taskqm4Kdf/test-tmp/kudu-admin-test.5.IsSecure_SecureClusterAdminCliParamTest.TestRebuildMaster_0.1750731222455155-31915-0/minicluster-data/ts-2/data/info.pb | hidden | 127.31.42.195:35639
Flags of checked categories for Tablet Server:
Flag | Value | Tablet Server
---------------------+---------------------+-------------------------
builtin_ntp_servers | 127.31.42.212:39925 | all 3 server(s) checked
time_source | builtin | all 3 server(s) checked
Version Summary
Version | Servers
-----------------+-------------------------
1.18.0-SNAPSHOT | all 4 server(s) checked
Tablet Summary
The cluster doesn't have any matching system tables
Summary by table
Name | RF | Status | Total Tablets | Healthy | Recovering | Under-replicated | Unavailable
--------------+----+---------+---------------+---------+------------+------------------+-------------
post_rebuild | 3 | HEALTHY | 1 | 1 | 0 | 0 | 0
Tablet Replica Count Summary
Statistic | Replica Count
----------------+---------------
Minimum | 1
First Quartile | 1
Median | 1
Third Quartile | 1
Maximum | 1
Total Count Summary
| Total Count
----------------+-------------
Masters | 1
Tablet Servers | 3
Tables | 1
Tablets | 1
Replicas | 3
==================
Warnings:
==================
Some masters have unsafe, experimental, or hidden flags set
Some tablet servers have unsafe, experimental, or hidden flags set
OK
I20250624 02:16:48.095531 31915 log_verifier.cc:126] Checking tablet 0cd08e8911a147fbb51dbb8d39ce6261
I20250624 02:16:48.096108 31915 log_verifier.cc:177] Verified matching terms for 0 ops in tablet 0cd08e8911a147fbb51dbb8d39ce6261
I20250624 02:16:48.096284 31915 log_verifier.cc:126] Checking tablet a4e0b888dd06491c9abbd5c20bddd98a
I20250624 02:16:48.853435 31915 log_verifier.cc:177] Verified matching terms for 206 ops in tablet a4e0b888dd06491c9abbd5c20bddd98a
I20250624 02:16:48.880352 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 5847
I20250624 02:16:48.923345 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 6001
I20250624 02:16:48.967657 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 6137
I20250624 02:16:49.018105 31915 external_mini_cluster.cc:1620] Killing /tmp/dist-test-taskqm4Kdf/build/tsan/bin/kudu with pid 5777
2025-06-24T02:16:49Z chronyd exiting
[ OK ] IsSecure/SecureClusterAdminCliParamTest.TestRebuildMaster/0 (36839 ms)
[----------] 1 test from IsSecure/SecureClusterAdminCliParamTest (36839 ms total)
[----------] Global test environment tear-down
[==========] 9 tests from 5 test suites ran. (186557 ms total)
[ PASSED ] 8 tests.
[ FAILED ] 1 test, listed below:
[ FAILED ] AdminCliTest.TestRebuildTables
1 FAILED TEST
I20250624 02:16:49.089565 31915 logging.cc:424] LogThrottler /home/jenkins-slave/workspace/build_and_test_flaky/src/kudu/client/meta_cache.cc:302: suppressed but not reported on 3 messages since previous log ~49 seconds ago